What can we do to reduce a website’s bandwidth usage?
1. Switch to an external image provider. Unless your website is an all-text affair, images will always consume the lion’s share of your outgoing bandwidth. Even on this site, which is extremely minimalistic, the size of the images dwarfs the size of the text. Consider my last blog post, which is fairly typical:
| Content | Size |
|---|---|
| Post text | ~4,900 bytes |
| Post image | ~46,300 bytes |
| Site images | ~4,600 bytes |
The text alone makes up only about ten percent of the content for that post. To make a dent in our bandwidth problem, we must deal with the other ninety percent, the images, first.
Ideally, we shouldn’t have to serve up any images at all: we can outsource the hosting of our images to an external website. There are a number of free or nearly-free image sharing sites on the net which make this a viable strategy:
ImageShack offers free, unlimited storage, but has a 100 MB per hour bandwidth limit for each image. This sounds like a lot, but do the math: that’s about 1.7 MB per minute, or roughly 28 KB per second. And the larger your image is, the faster you’ll burn through that meager allotment. Still, it’s incredibly easy to use, since you don’t even have to sign up, and according to their common questions page, anything goes as long as it’s not illegal.
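That per-image cap is easier to reason about as a quick back-of-the-envelope sketch in Python (illustrative arithmetic only; the 46 KB figure is the post image from the table above):

```python
# Convert ImageShack's 100 MB/hour per-image bandwidth cap into
# per-minute and per-second figures (assuming 1 MB = 1024 KB).
MB_PER_HOUR = 100

mb_per_minute = MB_PER_HOUR / 60           # ~1.7 MB per minute
kb_per_second = MB_PER_HOUR * 1024 / 3600  # ~28 KB per second

# A ~46 KB image could only be served about this many times per
# second before exhausting the hourly cap:
views_per_second = kb_per_second / 46

print(f"{mb_per_minute:.2f} MB/min, {kb_per_second:.1f} KB/s, "
      f"{views_per_second:.2f} views/s for a 46 KB image")
```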
Photobucket’s free account has a storage limit and a download bandwidth limit of 10 GB per month, which works out to a little over 14 MB per hour. Upgrading to a paid Pro account ($72/$96/$156 per year) removes the bandwidth limit. I couldn’t find any relevant restrictions in their terms of service.
Amazon’s S3 service allows you to direct-link files at a cost of 15 cents per GB of storage, and 20 cents per GB transfer. It’s unlikely that would add up to more than the ~ $2 / month that seems to be the going rate for the other unlimited bandwidth plans. It has worked well for a lot of other sites.
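A sketch of that cost arithmetic at the rates quoted above (the `monthly_cost` helper is mine, purely for illustration, and the example usage figures are hypothetical):

```python
# S3 pricing as quoted above: 15 cents per GB of storage per month,
# 20 cents per GB of transfer.
STORAGE_RATE = 0.15   # $ per GB-month
TRANSFER_RATE = 0.20  # $ per GB transferred

def monthly_cost(storage_gb, transfer_gb):
    """Estimated monthly S3 bill for the given storage and transfer."""
    return storage_gb * STORAGE_RATE + transfer_gb * TRANSFER_RATE

# e.g. a small site storing 1 GB of images and serving 8 GB a month:
print(f"${monthly_cost(1, 8):.2f}")  # $1.75
```

At usage like that, S3 lands right around the ~$2/month going rate of the flat-fee plans.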
Even though this ends up costing me $25/year, it’s still an incredible bargain. I am offloading 90% of my site’s bandwidth usage to an external host for a measly 2 dollars a month.
And as a nice ancillary benefit, I no longer need to block image bandwidth theft with URL rewriting. Images are free and open to everyone, whether it’s abuse or not. This makes life much easier for legitimate users who want to view my content in the reader of their choice.
Also, don’t forget that favicon.ico is an image, too. It’s retrieved more and more often by today’s readers and browsers. Make favicon.ico as small as possible, because it can have a surprisingly large impact on your bandwidth.
2. Turn on HTTP compression. Now that we’ve dealt with the image content, we can think about ways to save space on the remaining content, the text. This one’s a no-brainer: enable HTTP compression on your web server for a roughly two-thirds reduction in text bandwidth. Let’s use my last post as an example again:
| Content | Size |
|---|---|
| Post | 63,826 bytes |
| Post with HTTP compression | 21,746 bytes |
Never serve content that isn’t HTTP compressed. It’s as close as you’ll ever get to free bandwidth in this world. If you aren’t sure that HTTP compression is enabled on your website, use this handy web-based HTTP compression tester, and be sure.
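You can also sanity-check the effect of compression locally. This is only an illustrative sketch using Python’s standard `gzip` module on a synthetic page, not a measurement of any real server:

```python
import gzip

# Simulate what HTTP (gzip) compression does to markup. Real pages
# typically shrink to about a third of their original size; this
# synthetic, highly repetitive HTML compresses even better than that.
html = ("<div class='post'><p>Lorem ipsum dolor sit amet, "
        "consectetur adipiscing elit.</p></div>\n" * 200).encode()

compressed = gzip.compress(html)
print(f"{len(html):,} bytes -> {len(compressed):,} bytes "
      f"({len(compressed) / len(html):.0%} of original)")
```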
3. Compress your CSS and JavaScript. There’s a handy online CSS compressor which offers three levels of CSS compression. I used it on the main CSS file for this page, with the following results:
| CSS | Size |
|---|---|
| Original | 2,299 bytes |
| After removing whitespace | 1,758 bytes |
| After HTTP compression | 615 bytes |

The JavaScript for this page shows similar savings:

| JavaScript | Size |
|---|---|
| Original | 1,232 bytes |
| After refactoring | 747 bytes |
| After removing whitespace | 558 bytes |
| After HTTP compression | 320 bytes |
It’s possible to use similar whitespace compressors on your HTML, but I don’t recommend it. I only saw size reductions of about ten percent, which wasn’t worth the hit to readability.
Realistically, whitespace and linefeed removal is doing work that HTTP compression would otherwise do for us. We’re just adding a dab of human-assisted efficiency:
| CSS | Uncompressed | After HTTP compression |
|---|---|---|
| Unoptimized | 2,299 bytes | 671 bytes |
| Optimized | 1,758 bytes | 615 bytes |
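As a rough sketch of what whitespace-only compression does, here’s a minimal Python whitespace stripper run against some synthetic CSS. To be clear, `strip_css_whitespace` is a hypothetical toy, not the online compressor mentioned above, and not a full minifier:

```python
import gzip
import re

def strip_css_whitespace(css: str) -> str:
    """Toy whitespace-only CSS compressor: drop comments, collapse
    runs of whitespace, and tighten spaces around punctuation."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)  # strip comments
    css = re.sub(r"\s+", " ", css)                   # collapse whitespace
    css = re.sub(r"\s*([{};:,])\s*", r"\1", css)     # tighten punctuation
    return css.strip()

# Synthetic CSS, repeated to give the byte counts something to chew on.
css = """
body {
    margin: 0 auto;
    font-family: Georgia, serif;  /* body copy */
}
""" * 50

minified = strip_css_whitespace(css)
for label, text in (("raw", css), ("minified", minified)):
    data = text.encode()
    print(f"{label}: {len(data)} bytes, "
          f"gzipped: {len(gzip.compress(data))} bytes")
```

As with the real numbers above, the gzipped sizes of the raw and minified versions end up much closer together than the uncompressed sizes, which is exactly why this is a micro-optimization.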
Although this is definitely a micro-optimization, I think it’s worthwhile since it reduces the payload of every single page on this website. But there’s a reason it’s the last item on the list, too. We’re just cleaning up a few last opportunities to squeeze every last byte over the wire.
After implementing all these changes, I’m very happy with the results. I see a considerable improvement in bandwidth usage, and my page load times have never been snappier. But, these suggestions aren’t a panacea. Even the most minimal, hyper-optimized compressed text content can saturate a 300 KB/sec link if the hits per second are coming fast enough. Still, I’m hoping these changes will let my site weather the next Digg storm with a little more dignity than it did the last one– and avoid taking out the network in the process.