Reducing Your Website’s Bandwidth Usage

What can we do to reduce a website’s bandwidth usage?

1. Switch to an external image provider. Unless your website is an all-text affair, images will always consume the lion’s share of your outgoing bandwidth. Even on this site, which is extremely minimalistic, the size of the images dwarfs the size of the text. Consider my last blog post, which is fairly typical:

Size of post text      ~4,900 bytes
Size of post image    ~46,300 bytes
Size of site images    ~4,600 bytes

The text makes up only about ten percent of the content for that post. To make a dent in our bandwidth problem, we must deal with the other ninety percent of the content– the images– first.

Ideally, we shouldn’t have to serve up any images at all: we can outsource the hosting of our images to an external website. There are a number of free or nearly-free image sharing sites on the net which make this a viable strategy:

  • Imageshack
    ImageShack offers free, unlimited storage, but has a 100 MB per hour bandwidth limit for each image. That sounds like a lot, but do the math: it’s 1.66 MB per minute, or about 28 KB per second. And the larger your image is, the faster you’ll burn through that meager allotment. But it’s incredibly easy to use– you don’t even have to sign up– and according to their common questions page, anything goes as long as it’s not illegal.
  • Flickr
    Flickr offers a free basic account with limited upload bandwidth and limited storage. Download bandwidth is unlimited. Upgrading to a paid Pro account for $25/year removes all upload and storage restrictions. However, Flickr’s terms of use warn that “professional or corporate uses of Flickr are prohibited”, and all external images require a link back to Flickr.
  • Photobucket
    Photobucket’s free account has a storage limit and a download bandwidth limit of 10 GB per month (which works out to a little over 14 MB per hour). Upgrading to one of the paid Pro accounts ($72, $96, or $156 per year) removes the bandwidth limit. I couldn’t find any relevant restrictions in their terms of service.
  • Amazon S3
    Amazon’s S3 service allows you to direct-link files at a cost of 15 cents per GB-month of storage and 20 cents per GB of transfer. It’s unlikely that would add up to more than the ~$2/month that seems to be the going rate for the other unlimited-bandwidth plans. It has worked well for a lot of other sites.
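To get a feel for those S3 numbers, here’s a quick back-of-the-envelope calculation. Only the per-GB prices come from above; the storage and transfer figures are invented for illustration:

```python
# Hypothetical monthly S3 bill for a small blog's images, using the
# prices quoted above: 15 cents per GB-month stored, 20 cents per GB
# transferred. The usage numbers are made up for illustration.
STORAGE_PER_GB = 0.15
TRANSFER_PER_GB = 0.20

def s3_monthly_cost(stored_gb: float, transferred_gb: float) -> float:
    return stored_gb * STORAGE_PER_GB + transferred_gb * TRANSFER_PER_GB

# Say we store 1 GB of images and serve 8 GB of them per month:
print(f"${s3_monthly_cost(1, 8):.2f} per month")  # prints: $1.75 per month
```

Even at several gigabytes of transfer per month, that comes in under the ~$2/month going rate for the flat-fee plans.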

I like ImageShack a lot, but it’s unsuitable for any kind of load, due to the hard-coded bandwidth limit. Photobucket offers the most favorable terms, but Flickr has a better, more mature toolset. Unfortunately, the terms of use restrictions at Flickr prohibit commercial use. Amazon S3 may be the best long-term choice, since many, if not all, of these photo-sharing services are blocked by corporate firewalls.

Even though this ends up costing me $25/year for a Flickr Pro account, it’s still an incredible bargain: I’m offloading 90% of my site’s bandwidth usage to an external host for a measly two dollars a month.

And as a nice ancillary benefit, I no longer need to block image bandwidth theft with URL rewriting. The images are free and open to everyone, hotlinkers included. This makes life much easier for legitimate users who want to view my content in the reader of their choice.

Also, don’t forget that favicon.ico is an image, too. It’s retrieved more and more often by today’s readers and browsers. Make favicon.ico as small as possible, because it can have a surprisingly large impact on your bandwidth.
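A rough, hypothetical tally shows why the favicon matters. Both the icon sizes and the request rate below are assumptions, not measurements from this site:

```python
# Back-of-the-envelope favicon bandwidth. The icon sizes and the
# request rate are invented for illustration.
def daily_bandwidth(favicon_bytes: int, requests_per_day: int) -> int:
    return favicon_bytes * requests_per_day

big = daily_bandwidth(16_384, 10_000)   # a careless 16 KB icon
small = daily_bandwidth(1_150, 10_000)  # a trimmed ~1 KB icon
print(f"16 KB icon: {big / 2**20:.0f} MB/day")    # prints: 16 KB icon: 156 MB/day
print(f" 1 KB icon: {small / 2**20:.0f} MB/day")  # prints:  1 KB icon: 11 MB/day
```

For a file nobody ever looks at directly, 150+ MB a day is a lot of bandwidth to give away.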

2. Turn on HTTP compression. Now that we’ve dealt with the image content, we can think about ways to save space on the remaining content– the text. This one’s a no-brainer: enable HTTP compression on your web server for a roughly two-thirds reduction in text bandwidth. Let’s use my last post as an example again:

Post size                     63,826 bytes
Post size with compression    21,746 bytes

We get a 66% reduction in file size for every bit of text served up on our web site– including all the JavaScript, HTML, and CSS– simply by flipping a switch on our web server. The benefits of HTTP compression are so obvious it hurts, and it’s reasonably straightforward to set up.

Never serve content that isn’t HTTP compressed. It’s as close as you’ll ever get to free bandwidth in this world. If you aren’t sure that HTTP compression is enabled on your website, use this handy web-based HTTP compression tester, and be sure.
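If you want to see the effect for yourself, here’s a small sketch using Python’s standard gzip module on some repetitive, HTML-ish text. Real pages typically land near the two-thirds reduction shown above; this synthetic sample is so repetitive it compresses even further:

```python
# Demonstrate what HTTP compression does to markup: gzip the kind of
# repetitive text a web page is made of and compare sizes.
import gzip

html = ("<div class='post'><p>Lorem ipsum dolor sit amet, consectetur "
        "adipiscing elit.</p></div>\n") * 200
raw = html.encode("utf-8")
packed = gzip.compress(raw)

print(f"{len(raw)} -> {len(packed)} bytes "
      f"({100 * (1 - len(packed) / len(raw)):.0f}% smaller)")
```

This is the same DEFLATE compression your web server applies on the fly when a browser sends `Accept-Encoding: gzip`.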

3. Optimize the size of your JavaScript and CSS. The only thing left for us to do now is reduce the size of our text content, with a special emphasis on the elements that are common to every page on our website. CSS and JavaScript resources are a good place to start, but the same techniques can apply to your HTML as well.

There’s a handy online CSS compressor which offers three levels of CSS compression. I used it on the main CSS file for this page, with the following results:

original CSS size            2,299 bytes
after removing whitespace    1,758 bytes
after HTTP compression         615 bytes
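The “remove whitespace” step is simple enough to sketch yourself. Here’s a toy version of what such a compressor does– an illustration only, not a production minifier; it ignores edge cases like braces inside strings:

```python
# A toy CSS whitespace stripper, in the spirit of the online compressor
# mentioned above. A sketch, not a full minifier.
import re

def strip_css_whitespace(css: str) -> str:
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)  # drop /* comments */
    css = re.sub(r"\s+", " ", css)                   # collapse whitespace runs
    css = re.sub(r"\s*([{}:;,])\s*", r"\1", css)     # trim around punctuation
    return css.strip()

print(strip_css_whitespace("body {\n  margin: 0;\n  color: #333;\n}"))
# prints: body{margin:0;color:#333;}
```

Note that it only trims around punctuation rather than deleting all whitespace, so multi-part values like `margin: 0 auto` survive intact.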

We can do something similar to the JavaScript with this online JavaScript compressor. But before I put the JavaScript through the compressor, I went through and refactored it, using shorter variables and eliminating some redundant and obsolete code.

original JS size             1,232 bytes
after refactoring              747 bytes
after removing whitespace      558 bytes
after HTTP compression         320 bytes

It’s possible to use similar whitespace compressors on your HTML, but I don’t recommend it. I only saw size reductions of about 10 percent, which wasn’t worth the hit to readability.

Realistically, whitespace and linefeed removal are doing work that HTTP compression would otherwise do for us. We’re just adding a dab of human-assisted efficiency:

                   uncompressed   HTTP compressed
Unoptimized CSS    2,299 bytes    671 bytes
Optimized CSS      1,758 bytes    615 bytes

It’s only about a 10 percent savings once you factor in HTTP compression. The tradeoff is that CSS or JavaScript stripped of whitespace and linefeeds has to be pasted back into an editor to be effectively edited. I use Visual Studio, which automatically “rehydrates” the code with proper whitespace and linefeeds when I issue the autoformat command.
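That diminishing return is easy to reproduce: gzip shrinks the whitespace-heavy and the minified versions of a stylesheet to nearly the same size. The stylesheet in this sketch is made up, so the byte counts won’t match the table above:

```python
# Show that gzip does most of the work whitespace removal would do:
# compress a stylesheet before and after minification and compare.
# The stylesheet below is invented for illustration.
import gzip
import re

css = ("body {\n    margin: 0 auto;\n    max-width: 40em;\n"
       "    color: #333;\n}\n") * 40

# Crude minification: collapse runs, then trim around punctuation.
mini = re.sub(r"\s*([{}:;,])\s*", r"\1", re.sub(r"\s+", " ", css)).strip()

plain_gz = len(gzip.compress(css.encode()))
mini_gz = len(gzip.compress(mini.encode()))
print(f"gzipped as-is: {plain_gz} bytes; gzipped minified: {mini_gz} bytes")
```

Whitespace is exactly the kind of redundancy DEFLATE eats for breakfast, which is why minifying first only helps at the margin.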

Although this is definitely a micro-optimization, I think it’s worthwhile since it reduces the payload of every single page on this website. But there’s a reason it’s the last item on the list, too. We’re just cleaning up a few last opportunities to squeeze every last byte over the wire.

After implementing all these changes, I’m very happy with the results. I see a considerable improvement in bandwidth usage, and my page load times have never been snappier. But these suggestions aren’t a panacea: even the most minimal, hyper-optimized, compressed text content can saturate a 300 KB/sec link if the hits are coming fast enough. Still, I’m hoping these changes will let my site weather the next Digg storm with a little more dignity than it did the last one– and avoid taking out the network in the process.