We all know how important speed is when it comes to websites. But over the past couple of years I have noticed a disturbing trend. Because we are all trying to create multi-platform websites, web developers are turning to ready-built frameworks and templates.

The problem is that while these frameworks do make it easier (and faster) to write code, you typically use only a fraction of what they include. The result is bloat that makes your site terribly slow.

And I see this with so many sites. Look at the example below. It's a page speed test from one of our national newspapers, loading just one page.

The page itself is nothing special. Sure, it contains a lot of images and links, just like any other page on a newspaper site, but there is no super fancy interactivity. It's very basic stuff. 

In the image below you can see that it's actually loading 60 JavaScript files, and that's just in the header. The page loads a total of 80 script files. 80! This is combined with a total of 312 requests just to load one page.

The result is that the page loads rather slowly, and 69% of that is due to the scripts alone. The total size of the scripts is 0.7 MB, plus another 0.6 MB for other elements that couldn't be identified.

And of the total size of the page (a staggering 2.2 MB), more than half comes from non-page sources.

To put into perspective just how much code this page includes, think of it like this:

The article itself is 2,072 characters / 328 words long, but the JavaScript code is:

- 552,620 characters long
- 79,445 words long
(Harry Potter and the Philosopher's Stone was 76,944 words long) 
- And if you decided to print just the JavaScript code alone, it would take up 366 pages on US Letter paper, or 350 pages on A4.

...and that's just the JavaScript. You are forcing people to load an entire book of JavaScript with every article.

I see this problem on so many sites. Web designers have become lazy (or too busy). Instead of thinking about which frameworks they use, they just keep adding them. And of the frameworks they add, they often use only 1% of what the code is designed to do. Often most of the functionality is needed on only a very few pages, but it is included on every single page.
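
As a rough illustration of that last point (my own example, not something taken from the newspaper's page): a whole library is often loaded site-wide for a task that plain DOM code handles in a few lines.

    // With a framework loaded on every page (tens of kilobytes of script),
    // a typical one-off task looks like this:
    //   $('.menu').toggleClass('open');

    // The same behaviour in plain JavaScript, no framework required:
    var menus = document.querySelectorAll('.menu');
    for (var i = 0; i < menus.length; i++) {
      menus[i].classList.toggle('open');
    }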

Be smarter!
+Thomas Baekdal thanks for highlighting this issue. There are a lot of problems around this, and Google is not blameless either for their own stuff.
+Thomas Vackier JQuery is actually one of the lesser offenders here. If that's all you used, you'd be fine.
There are tools to help you learn which pieces are used for any particular page, and theoretically, you could construct a single JS or CSS file that only contains the pieces you need. Easier said than done though. Hopefully someone will solve this. We really need a preprocessing step during publishing.
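
For what it's worth, a preprocessing step along those lines is roughly what ES modules plus a tree-shaking bundler give you today. A minimal sketch (the file names and the choice of Rollup/webpack are my assumption, not something discussed in this thread):

    // utils.js -- a small in-house library with several exports
    export function slugify(text) {
      return text.toLowerCase().replace(/\s+/g, '-');
    }
    export function debounce(fn, wait) {
      var timer;
      return function () {
        clearTimeout(timer);
        timer = setTimeout(fn, wait);
      };
    }

    // page.js -- this page imports only what it actually uses
    import { slugify } from './utils.js';
    document.title = slugify(document.title);

    // A tree-shaking bundler (Rollup, or webpack in production mode) can then
    // drop the unused debounce() from the final bundle, so the page ships only
    // the code it needs.
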
BTW, when it comes to an extremely common library such as jQuery, if you simply include it in the header, sourced from a shared CDN such as Google's, there is a 90%+ chance that the user's browser already has the file cached.

So it really only (down)loads it in rare circumstances, and it should often already have it loaded into memory as well, since other open browser tabs already did.

For less common libraries/frameworks this of course would not apply, and this is not to take away from the overall issue of code bloat, especially when it is introduced through adware/adtech, Google Analytics, or anything else that sites mistakenly (or on purpose) do not load at least asynchronously (at the end of the HTML body, or with window.onload = function () { /* all JS stuff here */ }).
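
A minimal sketch of that async-loading idea (the script URL is a placeholder, not a real ad or analytics endpoint): inject non-critical scripts only once the page has finished loading, so they never block rendering.

    window.addEventListener('load', function () {
      var s = document.createElement('script');
      s.src = 'https://example.com/analytics.js'; // placeholder third-party script
      s.async = true;
      document.body.appendChild(s);
    });

    // For scripts referenced directly in the HTML, the async or defer
    // attributes on the <script> tag achieve much the same thing.
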
I don't really want to excuse the example in terms of likely needless code bloat -- but shouldn't the 60 sources problem largely be mitigated by any decent framework handling bundling & minification?

You make your tradeoffs and pick your poison -- either source scripts from the best CDNs available and hope you get your 304, or bundle it all up together and let gzip do its magic, hopefully from good static hosting. Problem largely handled, save my initial page load that might be slow. Even that should be mitigated if you build a good home page that pulls the resources in to pre-emptively cache them for the pages that will use them later.

I get more offended by ad networks injecting slow scripts that block page loads while they perform slower reverse DNS lookups, and even slower content loads while they figure out what demographic to serve. A single ad network script source tends to pull in 10 more, and the whole page stalls.

Sure, you're still pulling down a whole book -- but if you designed the site right, there's no reason to have ever fetched it more than once.
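
A sketch of that pre-emptive caching idea (the bundle URLs are hypothetical, and rel="prefetch" support varies by browser): once the home page has loaded, hint to the browser that it should pull the bundles later pages will need into the HTTP cache.

    window.addEventListener('load', function () {
      var later = ['/js/article-bundle.js', '/css/article.css']; // hypothetical bundles
      for (var i = 0; i < later.length; i++) {
        var link = document.createElement('link');
        link.rel = 'prefetch'; // low-priority fetch into the cache
        link.href = later[i];
        document.head.appendChild(link);
      }
    });
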
Is that true, +Alex Schleber? I always assumed that the browser did not reuse the cache across domains. I haven't tested that, so I think I will.
+Jason Brown The problem with caching is that it's only relevant for repeat visitors. This might be the case for newspapers, but for brands most of the traffic is usually new visitors (either because they have lost their cache over time, are using multiple devices... or have simply not visited the site before).

As for slow ads, read this:
The frameworks can be modified to include only the JS and CSS you need. There are ways to enhance performance, but it will come down to properly structuring content while still stimulating image-hungry short attention spans.
+Alex Schleber I'm trying to test that... but so far I'm getting 'Status: 200 OK' and not '304 Not Modified' on requests when loading it from different domains.
+Thomas Baekdal +Alex Schleber That is kinda what I expected. It's been a few years since I last dug through the browser cache, but it was always related to the browser tab and the current domain that had focus. That is the way it treats cookies, for example. I would be surprised if it navigated through all the other caches just to see if it had this object from another domain.
If page load times matter for business (SEO or sales), site owners usually invest in optimization. It's not hard to reduce JS + CSS at all; it's even covered in Google Webmaster Tools.

Anyway - if you're the only one who cares, support that newspaper and buy the print edition ;-) 
I have tried it both ways... same result. No matter what I do, I cannot get it to cache the jQuery file from the Google API across two sites.
Okay... weird. Just tested it in IE 11 and it is doing cross-domain caching there. But it's not doing it in Chrome. 
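
One way to see why the CDN copy is cacheable at all, regardless of which browser reuses it across domains: a small Node.js sketch (the jQuery version in the URL is just an example) that prints the caching headers the CDN sends.

    var https = require('https');

    https.get('https://ajax.googleapis.com/ajax/libs/jquery/1.11.1/jquery.min.js',
      function (res) {
        console.log('status:', res.statusCode);
        console.log('cache-control:', res.headers['cache-control']);
        console.log('expires:', res.headers['expires']);
        res.resume(); // discard the body; only the headers matter here
      });

A long max-age / far-future Expires is what allows a browser to reuse the file at all; whether it then shows a 200 "(from cache)" or a 304 depends on how each browser revalidates.
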
If you use +Dart instead of JS then your code will get tree shaking and your footprint will be only the code you use!  Win.
Agreed. I tested a competitor's site last weekend; their home page was 7 MB! That's just obscene, especially in a multi-device world where mobile data usage is constantly on the rise.
+Simon Bear plus the extra runtime code that is added by the dart2js compiler. Still a win though.
Disregard - comment made in error.
I personally think using jQuery is better than trying to do stuff in the DOM yourself. It makes the code easier to read and follow. Besides, the latest minified jQuery is a whopping 82 KB, and your browser will be smart enough to cache it the first time it loads (unless you are some idiot who has a badly configured web server). Even if it doesn't, it's one extra library to load, with no external references.

The reason why I say that 82 KB is not a big deal is that it's a single request (and even at 1 Mbps, it transfers in less than a second). By far the biggest "slow down" is the number of requests your page has to make before displaying content; requesting that single 82 KB item will probably take more like 3 seconds, just because of network latency to the website, the overhead of opening sockets, negotiating HTTP connections, etc., etc.

What I do not approve of is seasoned web developers using huge frameworks like Joomla or WordPress just to display a "hello" page. Now that is a poison which should be reserved for hobbyists and people who make personal blogs.
Thanks for this research - it confirms something I've always been suspicious about but never got around to investigating properly! When I was the webstats man in my last job I got pulled into my boss's office for spending too much time on the internet. It turns out the maximum allowed 'downloads' was 2GB a month, but this must have included browsing, and each Google Analytics page was about 1MB (caching aside), so... ;)
Great +Thomas Baekdal 
Definitely a point worth thinking about.
But as a user of the internet, I would rather have faster internet than cut down on the scripting, which ultimately makes the web beautiful.
As a developer, I would like the web to agree on a few standards that make common scripting part of the browser itself, so that I don't end up importing scripts on every page load. They would already be pre-loaded.
Just a thought.
I like your post.
This is such an old issue. We live in the day of dual-core CPUs and 10 Mb networking, and people still feel like thumping the Bible on this one. You stay with a clean design, then optimize as needed. Come on, we've been doing the same stuff forever.
+Jeremy Morgan The speed of a website is also affected by code parsing speed and rendering speed. Those are both controlled by the client PC.