
I’ve always known a few things that I should do to make my website go faster (turn on expires, use multiple domains, geoip, etc.), but nothing made it as clear as a tool that gave a pictorial view of what really happens when you load a page in IE (it uses an IE plugin to measure when things really happen). Anyway, to cut a long story short, I got the page load times for my site down to 2-3 seconds instead of 6-8 seconds.

Step 1. Turn on Keep-Alives.

Surprisingly, this is off by default in Apache. In httpd.conf set:

KeepAlive On
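There are a couple of related knobs worth setting at the same time; the values below are reasonable starting points rather than anything I’ve carefully benchmarked:

```apache
# Close idle keep-alive connections after 5 seconds so they
# don't tie up Apache workers.
KeepAliveTimeout 5

# Allow up to 100 requests over a single connection.
MaxKeepAliveRequests 100
```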

Step 2. Turn on compression.

You’ll need mod_deflate, but this is included by default.

In your VirtualHost config (assuming you are using one):

<VirtualHost *:80>
    AddOutputFilterByType DEFLATE text/html text/plain text/xml text/css text/javascript
</VirtualHost>
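One thing to watch: don’t compress things that are already compressed (images, zips) – you just burn CPU for no gain. The type list above already avoids them, but you can also exclude by URL; this is the standard mod_deflate pattern, with the extension list as an example:

```apache
# Mark requests for already-compressed files so mod_deflate skips them.
SetEnvIfNoCase Request_URI \.(?:gif|jpe?g|png|zip|gz)$ no-gzip
```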

Step 3. Add Expires headers.

For me, I very rarely change my images, but I do occasionally change my javascript. So I set the expires for my images to 1 month, and js/css to 1 day. What I probably should do is include a version number in my css and js filenames, and have them expire in 1 month too. You’ll need mod_expires loaded for this:

<VirtualHost *:80>
    ExpiresActive on
    ExpiresByType image/gif "access plus 1 month"
    ExpiresByType image/jpeg "access plus 1 month"
    ExpiresByType image/png "access plus 1 month"
    ExpiresByType image/x-icon "access plus 1 month"
    ExpiresByType text/javascript "access plus 1 day"
    ExpiresByType text/css "access plus 1 day"
</VirtualHost>
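On the version-number idea: put a version in the asset filename, serve it with a long expires, and just change the URL whenever the file changes. A minimal sketch in Java – the path scheme and version constant are invented for illustration:

```java
public class AssetUrl {
    // Bump this on each release; pages then reference the new URLs.
    static final int VERSION = 7;

    // Turn "/css/site.css" into "/css/site.7.css" so a new release
    // gets a fresh URL and a 1-month Expires header is safe.
    static String versioned(String path) {
        int dot = path.lastIndexOf('.');
        if (dot < 0)
            return path + "." + VERSION;
        return path.substring(0, dot) + "." + VERSION + path.substring(dot);
    }

    public static void main(String[] args) {
        System.out.println(versioned("/css/site.css")); // /css/site.7.css
        System.out.println(versioned("/js/menu"));      // /js/menu.7
    }
}
```

You’d pair this with a rewrite rule (or symlinks) on the server so /css/site.7.css actually serves /css/site.css.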

Step 4. Move your images/css off to a different domain.

IE 7 (and other browsers) will only make 2 connections to any given domain name at a time (more recent browsers will make more). Perhaps the easiest split is one domain for your images, and one for your css and/or javascript. There is another benefit: if you use cookies on your domain, and your images are served from that same domain, then the cookies are sent with every image request, so the user is slowed down just making the request for the image. Most users have slower upload than download, and your cookies might be as large as 1k, so that can have a fairly big impact on how fast your site is viewed.

You should trade off the number of domains with the cost of a dns lookup. In the US you can expect a DNS lookup to take 20-120ms. In Australia, it’s more likely to be around 200ms for a US site.

You can just make simple aliases with Apache using something like (the hostnames here are placeholders):

<VirtualHost *:80>
    ServerName www.example.com
    ServerAlias i1.example.com i2.example.com i3.example.com i4.example.com
</VirtualHost>

If you run a larger site with lots of images, you probably generate your site (well, you should). A great way to get your images spread across multiple domains, and still cached consistently across different pages, is to hash the filename (or the contents – slower) to generate the hostname to use, ie: "i" + ((hash(filename) % 4) + 1) + ".example.com" (with example.com standing in for your domain). I use java and jsp (and C), so I used the following snippets (watch out: the intermediate sum overflows for long filenames – in Java it just wraps and the final mask keeps the result in range, but in C signed overflow is undefined, so use an unsigned int there).

        public int fileHash(String uri) {
                int len = uri.length();
                int ret = 0;
                // walk the string backwards, accumulating a simple polynomial hash
                for (int i = len; --i >= 0; )
                        ret = ret * 7 + uri.charAt(i);
                // mask to 0-3, then shift to 1-4
                ret = ((ret >> 4) & 0x03) + 1;
                return ret;
        }


and in java:

        public static int fileHash(String uri) {
                int len = uri.length();
                int ret = 0;
                for (int i = len; --i >= 0; )
                        ret = ret * 7 + uri.charAt(i);
                return ((ret >> 4) & 0x03) + 1;
        }
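To see it end to end, here is the same hash as a self-contained program together with the hostname construction (example.com is a placeholder for your image domain); whatever the filename, the result always lands on i1–i4:

```java
public class ImageHost {
    public static int fileHash(String uri) {
        int ret = 0;
        // Walk the string backwards, accumulating a simple polynomial hash.
        for (int i = uri.length(); --i >= 0; )
            ret = ret * 7 + uri.charAt(i);
        // Mask to 0-3 (safe even if ret overflowed negative), then shift to 1-4.
        return ((ret >> 4) & 0x03) + 1;
    }

    public static String imageHost(String uri) {
        return "i" + fileHash(uri) + ".example.com";
    }

    public static void main(String[] args) {
        String[] files = { "/a.gif", "/images/logo.png", "/x/y/z.jpg" };
        for (String f : files)
            System.out.println(f + " -> " + imageHost(f));
    }
}
```

The same file always hashes to the same host, so the browser cache stays warm across pages.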

Step 5. Insert small js and css files directly into the page.

A lot of time is spent just connecting to a site (and, for small files, very little time downloading), so save the extra connection and include small css and js items right on the page. Also, if they are specific to that page, just inline them. The advantage of having them separate is that on subsequent pages they will already be cached, but if they are small or never used again, that’s pointless. Include css like this:

<style type="text/css">
  /* your rules here */
</style>

And for Javascript:

<script type="text/javascript"><!--
  // your code here
//--></script>

(these days, you can probably not bother with the <!-- and //-->)

Step 6. Put large repeated javascript and css into a separate file.

Conversely to Step 5, if you have javascript repeated across pages, you should externalize it.

Step 7. Merge javascript files together. Merge css files together.

Most of the time you can do this by just making one file with all the contents in it. I’m currently tracking down a problem where two javascript files didn’t work nicely when placed in the same file.

Step 8. Use a CDN

If your site is large enough, use a CDN (google it). I’m using MaxCDN. Hard to beat $10 for 1TB, and even their normal price of $100/TB is good. They don’t have a presence in some countries where I do, so I hack up the IP address using my vdns so that I usually use them, but for some countries I point to my own servers (that is a much longer story). I’ve only just started using them, but so far so good.

Step 9. Make sure webpages/images are not heavy

Sometimes you can use javascript to generate the html to reduce the size of the page. Less is more, so if you have too much stuff on your page, consider trying to simplify. It doesn’t work for all sites, but it works for most.

For images, consider that you really don’t need that 24-bit png, and an 8-bit one would do. What I have been doing for the larger images is loading the image in the GIMP (of course), reducing it to say 256, 128, or 64 colours, and seeing if I can notice a difference compared to the original image when zoomed in at 200%. In GIMP: right click -> Image -> Mode -> Indexed, select the number of colours, and then click Ok. Then use Ctrl-Z, Ctrl-Y, Ctrl-Z, etc. to see if you notice any difference.

Step 10. Put your ads in an iframe.

Actually, this should probably be step 1: even though you would think ads run on fast servers, a little analysis will quickly show you that 1. they are heavy, and 2. they are slow (even google’s). If your ad provider says they don’t support iframes, get someone else. It makes a huge difference to the loading time of your page. I’ve seen many say they don’t support it, but I tried it, and it worked just fine.
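What this looks like in practice – a plain iframe around the ad markup (the src and dimensions here are placeholders):

```html
<!-- The ad loads as its own document, so a slow ad server
     no longer blocks rendering of the main page. -->
<iframe src="/ads/banner.html" width="468" height="60"
        frameborder="0" scrolling="no"></iframe>
```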

Step 11. Minimize your js (and css)

Remove unnecessary comments, and use a minifier to compress your javascript. There are a number of tools around to reduce the size; check google for some (I haven’t minified my own code yet).

Step 12. Turn off ETags

It’s just a waste of space. Google for more info.

<VirtualHost *:80>
    FileETag None
</VirtualHost>

I got my content loaded in 1.7 seconds rather than 1.8 seconds without the etags 🙂 [of course this was not tested very well]. Certainly in this case less is more!

Other notes

  • Don’t use redirects – these will cause another page hit and more round trips. Instead you should configure apache to just load the page you want.
  • Put your css files at the top, so that rendering can start. Similarly, if you can, put javascripts at the bottom.
  • Have a small favicon.ico. Better to have one than not (otherwise you get a 404, which costs time), and don’t forget to give it an expires header too.
  • Combine multiple images into one. If you are using css, you can put all your images in one sprite image and just select the bits that you want with background-position. You can play around with the way you arrange the images, but generally laying them out across (rather than down) will make them compress better.
  • Include width and height attributes. This means the page won’t jump around as it renders, and the user can just click on what they want. And of course the width/height should match the actual image. It annoys me to see large images scaled down to a small one; it sucks up bandwidth, slows down rendering, and generally is not a good user experience.
  • Consider what browsers your users are using. Browsers like IE6/7 (and earlier versions of Firefox) only allow two connections per domain, but newer browsers allow more, so the balance tips away from multiple servers because of the extra cost of DNS lookups. HTTP pipelining seems like a great feature to me, and I’m surprised it’s not turned on by default in more browsers. It’s supported by most non-IE browsers, but usually disabled by default (except Opera, which uses some heuristics to turn it on/off). Poor support from proxies seems to be the reason it is not more widely adopted.
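For the sprite trick in the list above, a minimal sketch (the filename and pixel offsets are invented): every icon shares one background image, and background-position selects the slice to show:

```css
/* icons.png holds two 16x16 icons laid out side by side */
.icon-home, .icon-search {
    background-image: url(/img/icons.png);
    width: 16px;
    height: 16px;
}
.icon-home   { background-position: 0 0; }      /* first icon */
.icon-search { background-position: -16px 0; }  /* second icon */
```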


Although I haven’t finished optimizing my site, the changes I made have dramatically improved the page load time (and therefore the user experience). Note that in these graphs I already had ads in an iframe both before and after, and I already had the images running off a different web server.

Before the optimizations:

After the optimizations: You can see that around 1.8 seconds all of my files have finished loading, and it’s only ad files that are still loading, and this has little impact to the user, since the ads are in an iframe.

Other considerations

You should also consider your own situation when deciding how to optimize your site. For example, if you run a scammer site that’s just there for the google keywords, gets no repeat traffic, and has very few page views per user, then just whack everything onto the one page: you might as well get it all downloaded straight away, and there’s no point waiting for the secondary connections.

If, on the other hand, you are facebook and get 100 page views per user per visit, and they come back regularly, then you might consider having a lot of content written out by a javascript file that never changes, so that you don’t need to reload that content each time. A good example on my site is the menu system. I currently have a large div with all the contents of the menus, but if I changed that to a javascript file that wrote out the contents, then I would only need to load that javascript once while the user visits all the pages (and I get higher than average page views per visit).
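A sketch of that menu idea – the menu lives in a javascript file that never changes (so it caches once), and each page writes it out (the items, filename, and class name here are made up):

```javascript
// menu.js - served with a long Expires header; fetched once per visit.
function menuHtml() {
    var items = [
        ["Home", "/"],
        ["Photos", "/photos/"],
        ["About", "/about/"]
    ];
    var out = '<ul class="menu">';
    for (var i = 0; i < items.length; i++)
        out += '<li><a href="' + items[i][1] + '">' + items[i][0] + '</a></li>';
    return out + '</ul>';
}

// In each page, where the menu should appear:
//   <script src="/js/menu.js"></script>
//   <script>document.write(menuHtml());</script>
```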