
How to build a high performance website

This is part 2 of Wicked Fast Websites, a three-part series that explores why web performance matters and what you can do about it.

We’re in the middle of a perfect storm. Websites are larger, devices are more varied and less predictable, and performance expectations are higher than ever. Today, you’ll learn some simple tools and techniques you can use to build high performance websites.

An aside: Today’s websites are five times bigger than in 2009. In fact, they’ve gotten 20-percent bigger this year alone. One of the easiest ways to improve the performance of your websites is to put them on a diet.

Markup Order Matters #

When a browser accesses a webpage, it immediately begins reading and rendering the content. When it comes across external files like images and videos, it begins downloading them, two at a time. This is really useful, because a single large file doesn’t hold up the other files from being downloaded.

There are two exceptions to this process:

  1. CSS stops a browser from rendering content. Repaints are bad for performance, so the browser waits until it’s finished downloading the styles before rendering any additional content.
  2. JavaScript stops all other downloads. Because JS often manipulates objects on the page, the browser doesn’t want to download them until it knows exactly what the JS is going to do.

Put your CSS at the top of your page, up in the <head> element, to avoid repaints. Similarly, put your JavaScript files down in the footer of your site to maximize concurrent downloads. The exception to this is feature detection and polyfill scripts (like Modernizr or an HTML shim), which should go in the <head> because your stylesheet often relies on them.
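Putting those guidelines together, a page skeleton might look like this (the file names are just placeholders):

```html
<!DOCTYPE html>
<html>
<head>
	<!-- Feature detection and polyfills go in the head,
	     because the stylesheet may rely on them -->
	<script src="js/modernizr.js"></script>

	<!-- CSS in the head avoids repaints -->
	<link rel="stylesheet" href="css/style.css">
</head>
<body>
	<!-- Content here renders before the scripts below download -->

	<!-- JS just before the closing body tag maximizes concurrent downloads -->
	<script src="js/scripts.js"></script>
</body>
</html>
```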

The order of styles and scripts in your markup doesn’t make the page download content any faster, but it does help browsers start displaying it more quickly, and therefore appear faster.

Combine Similar Files #

One of the biggest bottlenecks in page load time is in downloading the actual files for your site. Each HTTP request adds additional load time. How much time? According to Google:

Every time a client sends an HTTP request, it has to send all associated cookies that have been set for that domain and path along with it. Most users have asymmetric Internet connections: upload-to-download bandwidth ratios are commonly in the range of 1:4 to 1:20. This means that a 500-byte HTTP header request could take the equivalent time to upload as 10 KB of HTTP response data takes to download. The factor is actually even higher because HTTP request headers are sent uncompressed. In other words, for requests for small objects (say, less than 10 KB, the typical size of a compressed image), the data sent in a request header can account for the majority of the response time.

It’s actually faster for a browser to download one 300kb file than it is to download three 100kb files. By combining similar file types together—a process known as concatenation—you can improve page performance.

When you can, combine all of your JavaScript into a single scripts.js file. Rather than loading separate CSS files for your base styles, small screens, bigger screens and so on, combine them all into a single stylesheet with media queries.
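For example, rather than linking to a base stylesheet, a small-screens stylesheet, and a bigger-screens stylesheet separately, combine them into one file. The breakpoints and styles here are purely illustrative:

```css
/* style.css: all three former stylesheets in one file */

/* Base styles */
body {
	font-size: 100%;
}

/* Formerly small-screens.css */
@media (max-width: 40em) {
	.sidebar {
		display: none;
	}
}

/* Formerly bigger-screens.css */
@media (min-width: 40em) {
	body {
		font-size: 112.5%;
	}
}
```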

And don’t think you can cheat by using the @import rule. That still requires additional HTTP requests.

Remove the Whitespace #

Minification is the process of removing spaces, line breaks and comments from your CSS, HTML, and JavaScript. Though it might not seem like a big deal, removing all those unneeded characters can decrease the size of your files by 40-percent or more.
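To get a feel for what minifiers do, here’s a toy CSS minifier. Real tools are far more thorough (and handle edge cases like URLs and quoted strings, which this sketch does not):

```javascript
// A toy CSS minifier: strips comments, collapses whitespace,
// and removes spaces around punctuation. Real minifiers do
// much more, but the basic idea is the same.
function minifyCSS(css) {
	return css
		.replace(/\/\*[\s\S]*?\*\//g, '')  // remove /* comments */
		.replace(/\s+/g, ' ')              // collapse runs of whitespace
		.replace(/\s*([{}:;,])\s*/g, '$1') // trim spaces around punctuation
		.replace(/;}/g, '}')               // drop the last semicolon in a block
		.trim();
}

var input = '/* base styles */\n.btn {\n\tcolor: #fff;\n\tpadding: 1em;\n}';
console.log(minifyCSS(input)); // ".btn{color:#fff;padding:1em}"
```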

One useful minification tool is Google PageSpeed Insights, a browser extension for Chrome and Firefox.

Minify Your CSS, JavaScript, & HTML #

In your browser, open up Developer Tools and click on the “Page Speed” tab. Then click “Analyze.” You’ll be given a list of things you can do to improve your site performance.

One of the items on the list will be “Minify CSS.” Click it. Under “Suggestions for this page” is a link to “see optimized content.” Follow that to get a minified version of your CSS provided by Google.

The result is a tiny but rather unreadable stylesheet. Rather than overwriting your human-readable CSS, paste the minified code into a new file called style.min.css and reference that in the header of your HTML. If you ever want to make updates, it’s as simple as removing the .min and reminifying when you’re done.

PageSpeed Insights provides similar links for your JavaScript and markup as well. Like with my CSS, I’ll keep a human-readable scripts.js file, and put my minified code in scripts.min.js. Minifying your markup gets a bit trickier, and results in less reduction in file weight, so you may choose not to follow that step.

Minify jQuery #

If you’re using jQuery on your site, Google provides a hosted and minified version that’s 34-percent of the original size. An additional benefit of using the Google-hosted version is that a lot of developers use this technique, so there’s a good chance your visitor already has jQuery cached in the browser and doesn’t need to download it at all.

HTML5 Boilerplate uses a smart implementation of this that provides a local fallback if the Google CDN is unavailable:

<script src="//ajax.googleapis.com/ajax/libs/jquery/1.10.2/jquery.min.js"></script>
<script>window.jQuery || document.write('<script src="js/vendor/jquery-1.10.2.min.js"><\/script>')</script>

You’ll notice that the http: is missing from the URL. This protocol-relative URL loads jQuery over whichever protocol the page itself uses, which helps avoid mixed-content errors on encrypted (https) domains.

Smarter Image Formats #

Different image formats work better for different types of graphics.

PNG is a lossless image format, so it keeps graphics sharp and crisp (as opposed to the lossy JPG format). For icons and simple images with clean lines, PNGs can actually be more lightweight than JPGs. But for photos and images with lots of visual noise, JPGs will be much smaller in size with comparable quality.

JPG Formats #

JPGs actually come in multiple formats. The most common on the web is the baseline JPG. Baseline JPGs start rendering at the top, and build down as they go.

An example of baseline versus progressive JPG rendering

Photo by Tony Luong

An alternative format that was popular a decade ago and is seeing a bit of a comeback is the progressive JPG. Progressive JPGs build in layers. Initially, the full image in low resolution is displayed, and as the image renders, it becomes increasingly crisp and clear.

While progressive JPGs are typically a little smaller than baseline, their real advantage is that they appear faster to the user because they display more content faster. And on smaller screens, the lack of clarity on initial renders may not even be as noticeable.

A chart showing progressive JPG support

Source: Performance Calendar

While all browsers display progressive JPGs, some browsers do a better job than others. For “non-supporting” browsers, the entire progressive JPG needs to download before it can be displayed, resulting in a worse experience than a baseline JPG.

Compress Your JPGs #

Photos can add a lot of weight. A high-quality photo can weigh as much as 700kb or more. By compressing photos, you can reduce them to less than 100kb while maintaining image quality.

A JPG compression rate of 70 is considered high-quality for the web.

Smush Your Images #

The metadata that photographs include—timestamps, color profiles, and such—can add quite a bit of weight. Smushing is the process of removing that metadata, and it can reduce the size of an image by more than 25-percent.

If you’re a Mac user, ImageOptim will smush your images without degrading the quality. It works for PNGs, JPGs, and GIFs. Windows lacks a clear one-for-one counterpart, though there are a handful of products that work well for different image types. If you’re a Windows user, check out b64.io, a web-based, drag-and-drop optimizer that seems to work just as well as ImageOptim (hat tip to Chris Coyier).

Icon Fonts #

Icon fonts take advantage of the CSS3 @font-face rule, and allow you to embed a font (kind of like webdings) on your site that contains all of your icons.

They offer a few advantages over image-based icons:

  • They’re lightweight.
  • All of your icons are in a single file.
  • They’re styleable with CSS.
  • Because they’re a font, they’re infinitely scalable, and look crisp on both regular and high-density displays.
  • They’re compatible all the way back to IE 5 (seriously).

There are two small considerations:

  1. Windows Phone 7 (running IE 9) lacks true @font-face support.
  2. Icons can only be one color.

The free IcoMoon app allows you to pick just the icons you need and even upload your own. There’s a lot to learn about icon fonts, so if you’re interested, check out my start-to-finish tutorial on using them.
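In CSS, the technique boils down to embedding the font with @font-face and mapping each icon to a code point with generated content. The file path and code point below are hypothetical; a generator like IcoMoon produces the real values for you:

```css
/* Embed the icon font */
@font-face {
	font-family: 'icomoon';
	src: url('fonts/icomoon.woff') format('woff');
}

/* Display an icon via a generated-content pseudo-element */
.icon-search:before {
	font-family: 'icomoon';
	content: '\e600'; /* code point assigned to the search icon */
}
```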

Image Sprites #

If you still need to serve up small sets of images, you should consider using image sprites. Rather than using multiple image files, you can combine all of your images into a single file and embed it using the background-image property, resulting in fewer HTTP requests.

Image sprites are useful if you need multi-color icons or Windows Phone 7 support, though there are some challenges in using them with high-density displays. They’re also a bit harder to maintain should you decide to add or remove an icon. If you’d like to use image sprites, the CSS Sprite Generator makes things a bit easier.
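A sprite works by setting the combined image as a background and shifting background-position so only the icon you want shows through. Assuming a hypothetical sprite.png with 16-pixel icons laid out in a row:

```css
/* Every icon shares the same background image... */
.icon {
	display: inline-block;
	width: 16px;
	height: 16px;
	background-image: url('img/sprite.png');
}

/* ...and background-position slides the visible
   window over to the right icon */
.icon-home {
	background-position: 0 0;
}

.icon-search {
	background-position: -16px 0;
}
```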

Adaptive Images #

The same image rendered on multiple devices of varying sizes

Web images need to look good on everything from low-powered phones and watches to big-screen TVs and high-density displays. But why should a smartphone get the same image as a big, high-density monitor?

Adaptive images are an approach to this challenge. By detecting the size of the display (and ideally bandwidth constraints), you can serve the right image size for the device. Unfortunately, there’s no great way to do this today, though there are a lot of people who are working on it.

Many of today’s workarounds use JavaScript image-replacement. These scripts often load after images have already been downloaded, however, so the image gets downloaded twice, which is worse for performance than not doing anything at all. Matt Wilcox has created a PHP-based solution that intercepts the image request on the server. It works really well, but is a bit complicated to set up.

I believe the most promising solution is a standards-based one (the W3C has several in the works) that lets the web developer offer the same image in multiple sizes, and lets the browser decide which one best fits the user’s current needs.
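One of those proposals, the srcset attribute, sketches out roughly like this (the syntax is still in flux, and the file names are placeholders):

```html
<!-- The browser picks whichever image best fits
     the display it's rendering on -->
<img src="photo-small.jpg"
     srcset="photo-medium.jpg 1.5x, photo-large.jpg 2x"
     alt="A photo served at a size appropriate to the display">
```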

Compress Your Site #

Your server can actually compress your website files—a process known as gzipping—before sending them to the browser. This results in about a 70-percent reduction in website size.

On Apache servers, you can enable gzipping with a simple modification to your .htaccess file. Learn how in this tutorial on GitHub.
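If you manage your own .htaccess file, the relevant directives look something like this (this assumes mod_deflate is enabled on your server):

```apacheconf
# Compress text-based assets before sending them to the browser
<IfModule mod_deflate.c>
	AddOutputFilterByType DEFLATE text/html text/css application/javascript application/json image/svg+xml
</IfModule>
```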

Some web hosts use a slightly different method to implement gzipping. You can check if it’s working on your site using gzipWTF.

Set Expire Headers #

Expire headers tell browsers to keep static assets stored locally so that a visitor’s browser doesn’t have to re-download them every time they visit your site.

This is also something that’s done using the .htaccess file. To set expire headers, follow these instructions on GitHub.
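A minimal sketch of the idea, assuming mod_expires is available (the cache lifetimes are just examples; tune them to how often your assets actually change):

```apacheconf
# Tell browsers to keep static assets cached locally
<IfModule mod_expires.c>
	ExpiresActive On
	ExpiresByType image/jpeg "access plus 1 month"
	ExpiresByType image/png "access plus 1 month"
	ExpiresByType text/css "access plus 1 month"
	ExpiresByType application/javascript "access plus 1 month"
</IfModule>
```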

In Summary #

  1. Markup order matters.
  2. Combine similar files.
  3. Remove the whitespace.
  4. Use smarter image formats.
  5. Compress your JPGs.
  6. Smush your images.
  7. Use icon fonts and image sprites.
  8. Consider adding adaptive images.
  9. Compress your site.
  10. Set expire headers.

These techniques can be implemented in about an hour, and make a big difference on site performance. You can test your site performance using the Pingdom Website Speed Test.

Have any questions or comments about this post? Email me at chris@gomakethings.com or contact me on Twitter at @ChrisFerdinandi.
