Easy ways to save bandwidth

After reading Jeff Atwood’s terrific post about saving bandwidth on web sites, I’ve moved the Geekrant RSS feeds over to Feedburner, using Steve Smith’s marvellous WordPress Feedburner plugin, which works in WP 2.0x and 1.5x.

I also turned on HTTP compression, which in WordPress is as easy as ticking a checkbox. It not only saves you bandwidth; users also get your pages served quicker, since the bottleneck is bound to be their connection, not their browser’s ability to decompress.
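For sites not running WordPress, the same effect can usually be had at the web server. A minimal sketch for Apache, assuming mod_deflate is available (the MIME-type list is my own choice, not a requirement):

```apacheconf
# Compress text responses on the fly (no effect if mod_deflate isn't loaded)
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css text/xml
  AddOutputFilterByType DEFLATE application/javascript application/rss+xml
</IfModule>
```

Images are already compressed, so there’s no point including them; compressing the HTML, CSS, JS, and feeds is where the savings are.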

We’ll see how it goes. Bandwidth use has been growing recently: January 2.8GB; February 2.7GB; March 3.4GB. It’s not at ludicrous levels, but if it keeps climbing, I’ll end up paying more for the hosting. Hopefully this will help bring it back down.

Update 8:40pm. First thing I notice is that when reading the feed from within the Feedburner site, it doesn’t treat relative paths to images properly. I guess I’ll have to use absolute paths, ‘cos at the moment in the previous post it’s trying to load http://feeds.feedburner.com/files/2007/mediagate-mg35.jpg instead of http://www.geekrant.org/files/2007/mediagate-mg35.jpg. I wonder how it treats relative links?
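The fix is to resolve every relative URL against the blog’s base URL before the content leaves the feed. A quick sketch of the idea in Python (the function name and the naive attribute regex are my own illustration, not anything Feedburner or WordPress actually does):

```python
import re
from urllib.parse import urljoin

def absolutise(html, base="http://www.geekrant.org/"):
    """Rewrite relative src/href attributes against the blog's base URL,
    leaving already-absolute URLs untouched."""
    def fix(match):
        attr, url = match.group(1), match.group(2)
        return '%s="%s"' % (attr, urljoin(base, url))
    return re.sub(r'(src|href)="([^"]+)"', fix, html)

print(absolutise('<img src="/files/2007/mediagate-mg35.jpg">'))
# → <img src="http://www.geekrant.org/files/2007/mediagate-mg35.jpg">
```

`urljoin` leaves absolute URLs alone, so it’s safe to run over a whole post; a real implementation would use an HTML parser rather than a regex.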


2 thoughts on “Easy ways to save bandwidth”

  1. Chris Till

    When DVD Plaza hit the 300,000+ hits a day mark years ago I had to do a TOTAL rethink on just how to deliver my content during a ~2 redevelopment from the ground up (I was rewriting my entire code base, designing an entirely brand new GUI, buying new servers, moving to a new data centre, etc).

    From memory I did a few pretty big things:
    1. Designed every aspect of the new GUI to be completely usable throughout every little section of the site – everything from dialogs to titles to buttons was made absolutely, strictly consistent so that I could simply reuse existing content.

    2. Moved to an entirely XHTML/CSS approach, so that all the formatting/layouts/etc. was stored in a CSS file.

    3. Rewrote all my JavaScript into a single, extremely cut-down .js file.

    4. Created “arrays” of images – despite having probably a few hundred logos, buttons, etc., I created something like 4 graphics for the entire site, and within those graphics is every image the site uses. Those 4 or so graphics are optimised for different areas – e.g. everything you need for the front page might be contained in 2 images, moving beyond the front page needs a 3rd image, and then beyond that a 4th…

    5. Configured my code to transmit very, and I mean VERY, tight rules on exactly when content is to be fetched. Every graphic, CSS, JS, etc. is set with rules that not only say when the item expires (like a whole year away or something) but also say “don’t even bother to check if the file has changed for the next year”.

    6. Although it was an utter nightmare to make work properly, since it’s technically not even a valid thing to do, I did ZLIB compression on the JavaScript, CSS, and RSS I transmit in addition to the PHP pages.

    7. Since my site is 100% pure PHP driven content, I had no reason to bother about XHTML readability – my PHP code is fully readable but I don’t care how readable the XHTML is, so there’s sweet bugger all white space in that too.

    8. Every single link, image, whatever is referenced by a relative path; there is no use of the domain anywhere.

    9. etc. etc. etc. etc
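    The “arrays” of images in point 4 are what’s now usually called a CSS sprite: one combined image downloaded once, with each element showing just its own slice via a background offset. A minimal sketch of the technique (the class names, file name, and offsets here are my own illustration):

    ```css
    /* One combined image; each element displays a different 16x16 slice */
    .icon {
      background-image: url(sprite.png);  /* hypothetical combined graphic */
      width: 16px;
      height: 16px;
    }
    .icon-home   { background-position: 0 0; }
    .icon-search { background-position: -16px 0; }
    .icon-rss    { background-position: -32px 0; }
    ```

    A hundred tiny icons become one HTTP request, which is the point: the per-request overhead often outweighs the image bytes themselves.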

    The end result for me is that users are requesting very few files from me, those files are heavily compressed/optimised, they don’t ever query me for updated files, they don’t send DNS queries for everything, etc., saving me something like 80% of my costs at the time… it was a massive saving.

    The end result for the user was a site that, even in our broadband era, is insanely fast for even dialup users due to very small data and very few files.

    One complication I had to allow for in all my new code, however, was that any time I change my CSS, JS, images, etc., my code has to automatically use a new filename, since the users’ web browsers are under strict instruction to never even bother checking whether the file has updated, and thus would never get the new content. So everything has a version number associated with it, e.g. PlazaGUI61d.css or PlazaIMG4f.jpeg. Mind you, if I recall correctly, one of the things I had to do to force CSS compression to work (since you’re not meant to do that) was to serve the CSS from an alternate extension – e.g. it would be PlazaGUI613d.dvd, for example.
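    This versioned-filename trick pairs naturally with the far-future expiry rules in point 5: the browser caches forever, and the URL itself changes when the content does. A common way to automate it is to derive the version from a hash of the file’s content rather than bumping numbers by hand. A sketch in Python (the naming scheme is my own, not DVD Plaza’s):

    ```python
    import hashlib
    import os.path

    def versioned_name(filename, content):
        """Cache-busting name: embed a short content hash in the filename,
        so the published URL changes automatically whenever the content
        does, and never changes when it doesn't."""
        digest = hashlib.md5(content).hexdigest()[:8]
        root, ext = os.path.splitext(filename)
        return "%s.%s%s" % (root, digest, ext)

    print(versioned_name("PlazaGUI.css", b"body { margin: 0 }"))
    # prints something like PlazaGUI.ab12cd34.css (hash depends on the content)
    ```

    Unchanged files keep their old name (and stay cached), while any edit produces a fresh URL with no manual bookkeeping.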

    This is all off the top of my head, though, and I couldn’t even describe what a massive and complicated rethink all this was, but it paid off in ways that showed in cold hard cash.

    In hindsight, I’ve never checked the impact of our RSS feed, but I compress even that, and our bandwidth has remained totally under control for years with these approaches, so I haven’t had reason to worry about it…

Comments are closed.