Reducing HTTP Requests

Published in Programming and Scripts on Tuesday, May 30th, 2006

The idea of reducing HTTP requests to optimize page download speed is something that I have discussed a few times on Fiftyfoureleven. Recently a couple of articles have been published on the topic, and I thought I'd revisit some of the ideas presented here.

HTTP requests and data packets

First, a quick look at just what we are talking about. Most of this is taken from CSS Sprites, Background Images and Website Optimization.

Packet size and HTTP requests

From Web Page Design and Download Time, by Jing Zhi of Keynote Systems (seen here - pdf), cited in Speed Up Your Site: Web Site Optimization:

The basic performance principle is therefore to make fewer requests and transmit fewer packets. From this principle, we can derive two basic design rules for well performing Web pages. First, reduce the overall size of the page, thereby reducing the number of bytes (and packets) to be transferred over the Internet. Second, limit the number of embedded objects on the page, such as images, each of which must be requested and transferred separately from server to browser.

They also found that it was the number of packets and not necessarily the overall size of the page that was important. If a packet could hold 1460 bytes (the figure given in the article) and your object was 1600 bytes, it would require two packets. They found that this object would transfer at the same speed as another object that was greater in size but still fit in two packets.

What this means is that the fewer requests you make for a given page, and the more effectively the responses fit into data packets, the faster your page will load.
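As a rough illustration of the packet math above, the packet cost of an object can be estimated by dividing its size by the payload a single packet carries (1460 bytes being the figure given in the article). This is a minimal sketch, not an exact model of TCP behaviour:

```javascript
// Rough estimate of how many packets an object of a given size needs,
// using the 1460-byte payload figure cited in the article.
function packetsNeeded(bytes, payloadSize = 1460) {
  return Math.ceil(bytes / payloadSize);
}

console.log(packetsNeeded(1600)); // 2 packets, same as...
console.log(packetsNeeded(2900)); // ...this larger object: also 2 packets
```

Both objects land on two packets despite the size difference, which is the point the Keynote study was making.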

Now, data packet size depends (correct me if I'm wrong) on the connection type - ADSL, T1, etc. - and even if it were consistent for every user, optimizing to fit packets perfectly might be just a bit too, umm, obsessive. So reducing HTTP requests should be good enough.

How to reduce HTTP requests, then?

Well, there have been two posts written recently that cover the topic:

Serving JavaScript Fast, by Cal Henderson, discusses the possibilities and pitfalls of delivering JavaScript (and CSS) by suturing files together, and Reducing HTTP Requests: An idea for a plugin, by Tim Lucas, discusses building a Rails plugin to do what Cal has suggested.

For more info on these ideas, I thought I'd highlight some old posts here on Fiftyfoureleven that deal with the topic.

Suturing files

With respect to putting files together and collapsing several requests into one, check out Stitch'em together over on CSS + PHP: Organized and Optimized? and Suturing your stylesheets over on Applied CSS Management and Optimization. Both of these discuss the idea of combining multiple CSS files into a single file.
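The core of the technique is simple: take each stylesheet and emit them back-to-back as one response, so the browser makes a single request instead of one per file. A minimal sketch of the idea, here in JavaScript rather than the PHP those posts use, with made-up file names and rules:

```javascript
// Combine the contents of several stylesheets into one response body,
// with a comment marking where each original file begins.
function suture(sheets) {
  // sheets: { filename: cssText } — hypothetical example data
  return Object.entries(sheets)
    .map(([name, css]) => `/* ${name} */\n${css}`)
    .join('\n');
}

const combined = suture({
  'layout.css': 'body { margin: 0; }',
  'colors.css': 'body { color: #333; }',
});
console.log(combined);
```

You keep the files separate on disk for maintainability and only stitch them together at serve (or build) time.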

Suturing images

Another idea, though one I have not been applying myself of late, is the one presented in CSS Sprites, Background Images and Website Optimization. The short of it from that article is that if you combine the images used on a website into one image file, you reduce the number of HTTP requests and speed up page rendering time.

This same idea is visited in Responsible CSS - Recycle your background images, which looks at reusing background images (or any image in use on your site) to pretty up other portions of your website. The example presented in that article (which is embarrassingly out of date) is to use a background tile to pretty up a blockquote. In that manner, you are recycling a background image, and prettying up your site without adding another file request.
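With a sprite, each individual graphic is revealed by shifting the combined image's background-position. Assuming a sprite sheet where icons are stacked vertically at a fixed height (the icon names and 32px height here are made-up examples), the offset for a given icon can be computed like this:

```javascript
// Given a vertical sprite sheet with fixed-height icons, return the
// CSS background-position value that reveals the icon at a given index.
function spritePosition(index, iconHeight) {
  return `0 -${index * iconHeight}px`;
}

// Hypothetical sheet: icon 0 = home, 1 = search, 2 = rss, each 32px tall
console.log(spritePosition(0, 32)); // "0 -0px"
console.log(spritePosition(2, 32)); // "0 -64px"
```

That computed value would go into a rule like `background-position: 0 -64px;` on the element showing the third icon; one image file serves every icon on the site.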

Both of those ideas do require a bit of strategizing to get it right, but if you are building a site that will be around for awhile, it may pay in the end to do this. I suppose that the gains in bandwidth will add up on high traffic sites as well.

Anyways, that's about it. Funny, I was thinking about this topic just the other day, as I looked at the source for the admin of our CMS and saw that no fewer than six JavaScript files were now being used (JavaScript creep, anyone?).

I consoled myself with the fact that at least the page was using Lightweight CSS Tabs, but after writing this article, I think it's time to suture together the dependent JavaScript files.

Note: Worth checking out The State of Browser Caching if you plan on playing with all of this!

Comments and Feedback

The idea of stitching CSS together with PHP is very interesting: keeping the files separate for management but outputting them together. And I employ the multiple-images-in-one technique, which is very useful.

I've used combined images for mouse rollovers (not for the packet-reduction benefits, but just because it made the CSS a bit easier). Using one master image for a whole site seems promising but might be a pain to manage.

And of course, the whole thing goes out the window when caching is disabled in the browser, but I guess only us web developers with our fancy web developer toolbars need to feel the pain.

This is also a great argument for putting CSS and JavaScript into the head of a page whenever it's unique to that page.

Using one master image for a whole site seems promising but might be a pain to manage.

Yeah, it can be tough. Obviously it could be broken down into various "compound" images, and that's where the strategy comes in. TagTooga uses AJAX heavily and reduces HTTP traffic in a few interesting ways:

(1) TagTooga is a Wiki web directory (integrating more than just links), so each detail page is a listing of resources. The data returned by an AJAX request is not pure HTML. Instead it is a series of "command1|data1|command2|data2|..." where the command might be an HTML DIV id, or a keyword such as "alert" or "eval". Anyway, to cut to the chase: the response is processed by the client-side JavaScript, so rather than sending redundant HTML (strings that commonly occur many times) you can instead send smaller "codes" and let the JavaScript expand them (like Huffman coding).
(2) Another good way to reduce traffic to/from the server is to eliminate HTTP requests where possible. For example, on TagTooga when the "Add URL" button is clicked, no request is sent to the server. Instead the HTML for the Add URL form is embedded directly in the Javascript. (I don't see the site's Javascript and CSS file sizes as being an issue because once they are loaded, they are cached by the browser.)
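A client-side handler for the kind of pipe-delimited response described in (1) might look something like this sketch (the command and data values are made up for illustration; the commenter doesn't show TagTooga's actual code):

```javascript
// Parse a "command1|data1|command2|data2|..." response into
// command/data pairs for the client-side code to act on.
function parseResponse(text) {
  const parts = text.split('|');
  const pairs = [];
  for (let i = 0; i + 1 < parts.length; i += 2) {
    pairs.push({ command: parts[i], data: parts[i + 1] });
  }
  return pairs;
}

// Hypothetical response: update a div's contents, then show an alert
const pairs = parseResponse('listing|<b>12 links</b>|alert|Saved!');
console.log(pairs);
```

Each pair would then be dispatched: a DIV id command replaces that element's innerHTML, while a keyword like "alert" triggers the corresponding browser behaviour.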

Some food for thought...

One of the best ways to reduce the number of packets for large files, and by that I'm referring to text (markup/CSS/JavaScript), is to gzip-compress them on the server side. Only the oldest browsers are not designed to handle such responses. The only drawback is that you pay a CPU penalty at the server, but there's always a penalty to pay one way or the other.

Good point about the GZip compression. I'm watching the balance between bandwidth and CPU usage to see if/when that would be of benefit.

Another thing we do at TagTooga: Our servers automatically generate thumbnails of each URL added to the system. The thumbnails however are stored on an entirely different server, and even hosted by a different hosting company. The thumbnails you'll see on TagTooga are served from "" which only exists to serve up images. I like this solution for future scalability. Once hits its limits, just add, then, etc.
