Frederic Filloux, who co-edits the Monday Note with Jean-Louis Gassée, dedicated last Monday’s note to the increasing bulk of Web page code, lamenting that a single page of text requires between 6 and 55 pages of code.
As an illustration, Filloux used an article from the venerable British daily The Guardian on the banning of Russian athletes from the Rio Paralympic Games.
The article contained 757 words, or 4,667 useful characters and spaces, while the underlying code contained 485,527 characters. This means that the final product weighed less than one per cent (0.96%) of the code, and even less if you count the external resources used, like scripts, CSS files, fonts and images.
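For the record, here is the back-of-the-envelope arithmetic behind that figure, as a small Python sketch using the character counts quoted above:

```python
# Content-to-code ratio for the Guardian article, using the article's own
# counts: 4,667 useful characters of text against 485,527 characters of code.

USEFUL_CHARACTERS = 4_667
CODE_CHARACTERS = 485_527

ratio = USEFUL_CHARACTERS / CODE_CHARACTERS
print(f"content-to-code ratio: {ratio:.2%}")  # -> content-to-code ratio: 0.96%
```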
Part of this bloating is due to the generous helpings of peripheral links and code used for marketing and advertising purposes such as tracking, statistics, or ad display. Media sites are especially fond of them, often loading dozens of trackers (JavaScript code to track and record user activity) per page.
As a reference point, in 1991, one of the first texts ever published on the Web, World Wide Web Summary, written in HTML by its inventor, Tim Berners-Lee, had just under 4,200 characters of readable text for under 4,600 characters of code. Of course, the Web page was minimalistic, but still, the content-to-container ratio was 90.5%, compared to 0.96% for the Guardian in 2016.
Last year, Maciej Cegłowski gave a great talk on “The Website Obesity Crisis” at the Web Directions seminar in Sydney.
(Any conscientious Web professional has a duty to watch this excellent 55-minute presentation.)
Maciej noted that this tendency to bulk up was not new. Joshua Bixby raised the alarm back in 2012, when the average Web page (imports included) tipped the scales at 1 MB (1,042 kB, to be precise), while at the same time, the devices used to view those pages (mobile phones, of course) were getting ever slower, smaller and less interactive. Joshua predicted that in 2015, the average page would weigh in at 2,344 kB.
The Guardian page weighs 3,420 kB, or 7,720 kB if you deactivate the ad blocker, which translates to roughly 19 minutes of loading time over a 56k modem and 7 minutes 45 seconds over a 128 kbps ISDN line.
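To see where those figures come from, here is a rough Python sketch of the underlying arithmetic for the no-ad-blocker weight: raw transfer time is simply page weight divided by link speed. It ignores latency, protocol overhead and compression, so it only gives the order of magnitude of the numbers quoted above.

```python
# Raw transfer-time arithmetic for the Guardian page without an ad blocker
# (7,720 kB): page weight in bits divided by link speed in bits per second.

PAGE_WEIGHT_KB = 7_720  # kilobytes

def transfer_time_minutes(weight_kb: float, bandwidth_kbps: float) -> float:
    """Raw transfer time, in minutes, for `weight_kb` kilobytes over a link
    of `bandwidth_kbps` kilobits per second (no latency, no overhead)."""
    return (weight_kb * 8) / bandwidth_kbps / 60

for link, kbps in [("56k modem", 56), ("ISDN-128", 128)]:
    print(f"{link}: {transfer_time_minutes(PAGE_WEIGHT_KB, kbps):.1f} minutes")

# 56k modem: 18.4 minutes
# ISDN-128: 8.0 minutes
```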
“While bigger pages hurt performance for desktop users, too, the biggest victims of page bloat are mobile users. Not only does a 1 MB page take forever to load, it can also deliver a nasty case of sticker shock when you get your phone bill. To give you a for-instance, earlier this month I was travelling in Europe. Before I left home, I bought 25 MB of data from my provider for $100. In other words, I’m paying $4 per page.” — Joshua Bixby, 2012.
While the price of mobile phone service has since dropped, the weight of pages is on the rise, meaning we must prepare for 5 MB pages by 2020. And the heavier the page, the slower it loads, the unhappier the user and the more expensive the Web: servers must be let out, cables widened, optimisation re-optimised… not to mention that any potential advantages of this bloating are meagre, and downright non-existent for users.
Today, the rule of thumb is that a page should weigh no more than 2 MB and take no more than 2 or 3 seconds to load on a good connection. You are reading this article on a page that weighs 5,910 kB across 93 requests (the fat-cat pictures account for much of it), meaning I waited 4.2 seconds for it to load. We can probably do better; we’re working on it.
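Purely as an illustration of that rule of thumb, here is a minimal sketch of such a check in Python, assuming the third-party requests library. It only weighs the HTML document itself, not the images, CSS, fonts and scripts it pulls in, so it understates the real figure; a proper audit would replay the whole waterfall in browser developer tools or a service such as WebPageTest.

```python
# A crude "weight check" for a single URL against the budget above
# (2 MB, 2-3 seconds). Measures only the HTML document, not its imports.

import time
import requests  # third-party HTTP library, assumed to be available

BUDGET_BYTES = 2 * 1024 * 1024  # 2 MB
BUDGET_SECONDS = 3.0

def check_page(url: str) -> None:
    start = time.monotonic()
    response = requests.get(url, timeout=30)
    elapsed = time.monotonic() - start
    weight = len(response.content)
    print(f"{url}: {weight / 1024:.0f} kB in {elapsed:.1f} s")
    if weight > BUDGET_BYTES or elapsed > BUDGET_SECONDS:
        print("  over budget: time for a diet")

check_page("https://example.com/")
```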
For commercial sites, poor performance correlates with higher bounce rates, fewer pages viewed per visitor and, worse, a drop in conversion rates. Pages that take more than 7 to 8 seconds to load are said to cross the “performance poverty line”. This means that if you have something to sell, you might as well pack it up.
For instance, Walmart increases its conversion rate by 2% for every second of load time gained. That’s just a few million dollars per second…
I believe that any Web project should end with a quality-control “weight-loss camp”, where each and every line of code and import (images, CSS, fonts, JS, etc.) is reviewed and called into question. Better yet, work could be done upstream to ensure that frameworks are not automatically used when not absolutely necessary. Another idea would be to throttle the Internet connections of developers and Web site managers, to give them a taste of their own medicine and get them to understand that not everyone, even in rich countries, has a fantabulous Internet connection.