Optimising your site's delivery size and performance

06 July 2012
Leszek Ciesielski
Most approach the topic of site optimisation from either an SEO or a user experience perspective. However, when your site pushes terabytes of traffic and the bills from your CDN provider run to thousands of pounds per day, I reckon every change you can make to minimise the delivery size will count. Let's look at the tools and practices that can decrease your hosting bill significantly while making your users happier at the same time.

Tools

If you are using a CDN - make sure you check the traffic statistics from your provider so that you know which assets account for the highest costs. Remember the Pareto principle: it's very likely that the top few items contribute the bulk of the traffic. If you aren't using a CDN - you should probably sign up with one right now to take advantage of cheaper hosting (and faster user experiences) instead of paying for the same amount of traffic going straight to your server.
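The Pareto check above takes only a few lines of Python. The asset names and byte counts below are invented for illustration - in practice you would pull these figures from your CDN provider's report or your access logs:

```python
# Hypothetical per-asset daily traffic in bytes (made-up numbers; real
# figures come from your CDN report or server access logs).
traffic = {
    "video/intro.mp4": 60_000_000_000,
    "img/hero.jpg":    45_000_000_000,
    "js/bundle.js":    20_000_000_000,
    "css/site.css":     3_000_000_000,
    "img/logo.png":     2_000_000_000,
}

total = sum(traffic.values())
cumulative = 0
# Walk the assets from heaviest to lightest, printing each one's share
# of the total traffic and the running cumulative share.
for asset, size in sorted(traffic.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += size
    print(f"{asset}: {size / total:.0%} of traffic"
          f" (cumulative {cumulative / total:.0%})")
```

With these sample numbers, the top two assets alone account for over 80% of the traffic - exactly the kind of shortlist you want before deciding where to spend optimisation effort.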

But don't lose heart if you do not have a CDN at the moment, as the data you're after can also be found by combining web analytics information with statistics gathered by a page analysis tool. The two most popular ones are YSlow and Page Speed, both available as Firefox and Chrome extensions. Navigate to the suspect page, open Firebug, run the analysis and check the results. I have included a few screenshots illustrating the performance analysis results for an asset-heavy page belonging to one of our clients which, before optimisation, transferred almost 150 gigabytes of data per day through the CDN.

Reading the measurements

[Screenshots: YSlow statistics and overall grade before optimisation]

Oh, this hurts. Approaching 3 megabytes per user and a summary grade of D suggest that this page needs some serious attention. As a rule of thumb, you should aim for grade A on the landing page, and at least C on subsequent content pages. Almost every site has a significant bounce rate on the initial page, so make sure that those fly-by users aren't costing you much. So, what can be done?

Optimisation steps

Here is a quick walk-through:

  1. Identify the content that you don't need to host yourself. This might sound trivial, but perhaps you are embedding videos on your pages that you could easily upload to YouTube, or including jQuery (or one of the many other popular JavaScript libraries) instead of relying on the free CDN Google provides for them. Fixing this not only reduces the amount of traffic you have to pay for yourself; it is also highly probable that the user's browser already has those files in its cache.
  2. Check the images you serve. Until Google's magical one-size-fits-all WebP image format becomes available in all browsers, you'll need JPEG for photographs and PNG for most computer-generated images (i.e. those with large areas of a single colour). Make sure those images are as small as possible - unless you are running a site dedicated to photography, there's a lot of room for optimisation here. PNG is a lossless format, so compress the hell out of it - use PNGGauntlet, which really works wonders. JPEG, on the other hand, is lossy, so there's no fully automatic tool here. But there's a large probability that the images that came with the layout templates were virtually uncompressed - and travelled all the way into production with the quality settings still cranked up high. Use a tool that allows you to set the chroma subsampling to 2x2 (GIMP, Photoshop - whatever you have available) and experiment with the quality setting. Compare the original and resulting image side by side, checking areas such as human faces or gradual colour changes for artefacts - if you find any, try again with a higher quality setting. We've had cases where background images were reduced from 1.8MB to 120KB without a visible change in quality! You will have to experiment here, so focus on the images that every visitor has to download.
  3. Ensure your server is configured correctly - each asset should be served compressed and with Expires and ETag headers set. This doesn't help with the initial load, but saves tons of data for returning users.
  4. Combine and minify your JS and CSS files. This easily shaves 10% off their size (and speeds up browser page rendering as a side effect). You can use a compile-time minifier, like the Microsoft Ajax Minifier (warning: works for Java projects too!), or a runtime tool (one will be built into the .NET 4.5 release). For anyone not yet on the latest .NET Developer Preview, Cognifide has an Assets Optimiser module, which will be released as Open Source soon™.
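For step 3, a minimal Apache sketch might look like the fragment below (it assumes mod_deflate and mod_expires are enabled; nginx and IIS have direct equivalents):

```apache
# Compress text assets on the fly
AddOutputFilterByType DEFLATE text/html text/css application/javascript

# Tell returning browsers these assets stay fresh for a year
ExpiresActive On
ExpiresByType image/png  "access plus 1 year"
ExpiresByType image/jpeg "access plus 1 year"
ExpiresByType text/css   "access plus 1 year"

# ETags are on by default; keep them consistent across a server farm
# (inode numbers differ per machine) so conditional requests get 304s
FileETag MTime Size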
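The gains from serving text assets compressed (steps 3 and 4) are easy to demonstrate. This sketch concatenates some repetitive CSS-like text - invented here as a stand-in for real stylesheets, which are similarly repetitive - and measures how much gzip takes off the wire size:

```python
import gzip

# Invented stand-in for a site's stylesheets: real CSS repeats
# selectors, properties and values, which is why gzip does so well.
stylesheets = [b".header { margin: 0; padding: 0; color: #333; }\n"] * 200
combined = b"".join(stylesheets)

compressed = gzip.compress(combined, compresslevel=9)
saving = 1 - len(compressed) / len(combined)
print(f"{len(combined)} bytes -> {len(compressed)} bytes ({saving:.0%} saved)")
```

On genuinely varied production CSS the ratio will be less dramatic than on this artificially repetitive sample, but 60-80% savings on text assets are routine - which is why compression belongs in the server configuration, not on the wishlist.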

Results

Here is the summary result of those changes on the sample site I tested:

[Screenshots: YSlow statistics and overall grade after optimisation]

The delivery size is now 68% smaller, meaning the page downloads much faster and costs much less to serve. All the listed improvements have been introduced, to the extent possible. The site still gets an F for missing expiry headers because of external scripts served by Facebook, Twitter and Google Analytics - unfortunately, those are outside our immediate control. Overall, the traffic volume has decreased to 41 gigabytes per day (from almost 150 gigabytes before the optimisations) while serving the same number of visitors! So, what would you like to buy with the thirty thousand pounds we just saved you per month - or £360,000 annually?

Conclusion

Minimising the delivery size - and, as a consequence, the bills from your CDN provider - is technically feasible and does not take much effort. Further, free page analysis plug-ins like YSlow and Page Speed are available for the most popular browsers, Mozilla Firefox and Google Chrome. From the statistics they generate you can, with some effort, make sure that your site obeys the rules for high performance web sites at a really low cost.