
Why do some pages blink into place like they have a hyperdrive installed, while others dawdle their way into existence? The fast ones have usually figured out a few simple tricks. Those tricks are easy to pick up and the benefits are broad: they improve the user experience and they make search engine optimization easier. A couple of them are a little more server intensive (like Keep-Alive), but most actually make your site easier to deliver.


Keep-Alive

Start this on the server side. HTTP, the de facto protocol of web traffic, is stateless. It lives in a state of perpetual amnesia: with each request, your client forgets where it found the resource and looks again. That amnesia has an upside, since the client never memorizes a path between itself and the server; it finds the best available path each time. But opening a connection, finding the source and getting the response is expensive in terms of time: the client spends a lot of it waiting and searching. Keep-Alive keeps the connection alive (simple, huh?). Instead of opening and shutting the route for every file, it opens the route once, moves multiple files through it, and closes the connection after all of the files are downloaded.

If you want a snappy reaction, your server has to be set up with Keep-Alive on. The trade-off is that the server has to hold each connection open much longer, so a busy site can get slammed by the sheer number of concurrent open threads. One site I know has modulitis (see below): it can deliver pages quickly, but sometimes it overloads and the server crashes. Keep-Alive is a blessing and a curse.

Why don't all servers have this on? It's more processing intensive, so hosts that run many sites off of one server will turn it off. For them, it's more economical to pay for the extra bandwidth that constantly opening and closing connections costs than to pay for more servers, each housing fewer sites, that could manage the traffic peaks.

You can min-max this in a multi-server environment: serve static files through a Keep-Alive connection, because plain files are simple, and serve dynamic content (dynamic web pages, dynamically built images, etc.) from a non-Keep-Alive server to keep those interactions transient.

If you're looking to save money at the expense of a snappy user experience, keep Keep-Alive off. If delivery speed and the quality of the user experience are paramount: turn it on.
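If you run your own Apache box, the switch lives in httpd.conf (or an included conf file). A minimal sketch; the directive names are Apache's own, but the values are illustrative and should be tuned to your traffic:

    # httpd.conf: reuse one TCP connection for multiple requests
    KeepAlive On
    # How many requests a single connection may serve before closing
    MaxKeepAliveRequests 100
    # Seconds an idle connection is held open; keep this low on busy
    # servers so open threads don't pile up and slam the box
    KeepAliveTimeout 5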


Flush Your Output

Hang time is a killer when you're waiting for a web page. Maybe the website has gone missing and you're just sitting there like a schmoe. How is a user to know? Some content management systems (cough, Drupal) allow for a crazy amount of hang time before page delivery begins. By default, PHP can buffer output for a good length of time before delivering it, but you can urge it to begin earlier by calling the flush() function to push whatever is ready out to the browser. I do this by putting the flush() call at the top of the page.tpl.php file: if you're about to output the themed data, you're ready to deliver content, so push it out as soon as you can, even before page.tpl.php is fully populated and served. One thing to note: flushing content may not play well with GZip (see below).
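A minimal sketch of that top-of-template call. The ob_* guard is my own addition for setups where PHP's output buffering is active; a plain flush() alone matches what I describe above:

    <?php
    // Top of page.tpl.php: push everything PHP has buffered so far out
    // to the browser, so rendering starts before the template finishes.
    if (ob_get_level() > 0) {
      ob_flush();  // empty PHP's own output buffer first, if one exists
    }
    flush();       // then flush the server-level buffers to the client
    ?>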


Fewer File Downloads

Way back in the day, an HTML file was all you got; then came images and Javascript, followed by CSS files to control the look of the site. A web page isn't so much a page of content as an assault of files barraging your web browser. These supporting files come down two at a time (some newer browsers suck down four threads at once), so while some files are downloading, others sit waiting, and you get a big queue of files. This weakness shows up in a massive way with Drupal: each module is incorporated into the code, and each module brings one or more Javascript files and stylesheets to the party. If you're suffering from modulitis (too many cool modules in play), your page may be paralyzed waiting for these supporting files. In broad strokes: look for any way to optimize your page downloads. Minimize the number of files and their size, and look for ways to cache content locally (CSS files, Javascript, and images) by making it consistent and easy to re-use.


Aggregate CSS and Javascript

You should collapse all of your Javascript and CSS into one file each (all CSS in one file; all Javascript in another). While Drupal does ask for a lot of supporting files, it compensates by allowing the aggregation of Javascript and CSS into collected files. In extreme cases, this knocks the count down from 30+ files to two to four. If you're using a CMS, make sure it can aggregate these supporting files. If you're hand-building a design, follow this rule as closely as possible.
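In markup terms, the payoff looks something like this; the module names are real Drupal examples, but the paths and file names are made up for illustration:

    <!-- Before: every module brings its own supporting files -->
    <link rel="stylesheet" href="/modules/views/views.css" />
    <link rel="stylesheet" href="/modules/cck/content.css" />
    <script src="/modules/views/views.js"></script>
    <script src="/modules/cck/content.js"></script>
    <!-- ...and a couple dozen more like these... -->

    <!-- After aggregation: one stylesheet, one script -->
    <link rel="stylesheet" href="/files/css/css_a1b2c3.css" />
    <script src="/files/js/js_d4e5f6.js"></script>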

Keep in mind that if you have lots of third-party plug-ins and widgets on your site, a) you cannot aggregate those, and b) you have to wait for them to come down before your page is considered "downloaded." That's an eternity, and you may question whether you want a super slow but tricked-out site or one that's a little lean and mean.


GZip

Web clients run on super-powered machines. The average web browser runs on a desktop with 60 times the processing power of what was possible in the 1990s. That's a lot of processing you, as a web developer, can lean on. Your content can be compressed if the web client, the recipient, is capable of receiving compressed content and decompressing it. Browsers and servers are pretty good at handling this negotiation: the browser announces, "I can accept zipped files," and the server ships them shrunk and optimized, without anything looking different on the surface. Under the hood, there are a couple of ways to enable this on your server. Apache has a module called "mod_deflate" that compresses files on the way out. Many cPanel installs come with an "Optimize Website" option that does the same; it activates GZip for your site. When the files are smaller, they arrive faster because they use less bandwidth. Fast is good. Note: as per above, GZip and flush may be a problematic combo, but I encourage you to experiment. Ideally, flush PHP pages without GZip, and GZip static text elements like Javascript and CSS (image formats such as JPEG and PNG are already compressed, so zipping them gains little).
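If you're configuring Apache by hand rather than through cPanel, a minimal sketch, assuming mod_deflate is enabled:

    # httpd.conf or .htaccess: compress text content on the way out
    AddOutputFilterByType DEFLATE text/html text/plain text/css application/javascript
    # JPEG, PNG and GIF are already-compressed formats, so they are
    # deliberately left off the list: re-zipping them gains little.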


CSS Sprites

CSS sprites use portions of a larger image to fulfill some graphical need on your web page. Spriting isn't a new concept; I probably built my first sprite for a video game over 25 years ago. But its role in web design is comparatively new.

In pursuit of fewer file downloads, you can lump multiple graphical elements into a single image and then use CSS to slice that image for use. There is a lot of finesse to slicing an image with CSS: you need to pay attention to how each piece will be used, and you need to be comfortable with CSS backgrounds, cropping, and the repeat concepts. We have more information on how CSS sprites work here.
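A minimal sketch of the slicing, using a hypothetical icons.png that is 16 pixels wide and 32 tall, with a "home" icon stacked on top of a "search" icon:

    /* One downloaded image feeds every icon on the page */
    .icon {
      width: 16px;
      height: 16px;
      background-image: url(/images/icons.png);  /* hypothetical sprite sheet */
      background-repeat: no-repeat;
    }
    .icon-home   { background-position: 0 0; }     /* show the top 16px slice */
    .icon-search { background-position: 0 -16px; } /* slide up to the lower slice */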


Design Decisions

There are a number of design decisions that won't compromise your final product but will improve its delivery.

Embrace CSS3 – Rounded corners, shadows, angled text: those are all available with CSS3. If you're worried about the pain of picking up CSS3 or getting it cross-browser compatible, have no fear: http://css3generator.com/ is here. You would be surprised by what you can accomplish. CSS3 is more flexible because it relies on styling controls in lieu of images. Look for all of the ways you can replace images with CSS: shadows on text and div tags, tilted text, multiple columns, gradients, et cetera. Cooler still, text in its raw form can be indexed easily by search engines; the same effect done with images would need alt and title tags, would take longer to deliver, and would be less flexible. And. And. And. Give CSS3 a try.
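For instance, a decorated button that would once have needed sliced background images can be pure CSS3. A sketch, with vendor prefixes omitted (a generator like the one above will produce them for you):

    /* A "graphical" button with zero image downloads */
    .fancy-button {
      border-radius: 8px;                       /* rounded corners */
      box-shadow: 2px 2px 4px rgba(0,0,0,0.4);  /* drop shadow on the box */
      text-shadow: 1px 1px 2px #333;            /* shadow on the label text */
      background: linear-gradient(#69c, #369);  /* gradient instead of an image */
    }
    .tilted { transform: rotate(-3deg); }        /* angled text, still indexable */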

Consider inline images – Something cool: instead of referencing an image file, you can embed the image's data directly in your markup or stylesheet as a data URI. An image is, after all, just a file of characters, so rather than calling the file you can deliver its encoded data inline and save a download. Here are two links on how it works, with a short sketch after them:
http://mark.koli.ch/2009/07/howto-include-binary-image-data-in-cascading-style-sheets-css.html
http://www.ietf.org/rfc/rfc2397
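A minimal sketch of the technique those links describe. The Base64 payload below is the classic 1x1 transparent GIF, standing in for your real image data:

    /* Instead of background-image: url(/images/spacer.gif), ship the bytes
       inline as a data URI and save the browser a round trip. */
    .spacer {
      background-image: url(data:image/gif;base64,R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7);
    }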

Put some of your Javascript at the bottom of the page – Javascript changes the final look of a page: it introduces content and functionality. Because of that, Javascript files have to come in one by one and be digested before the rest of the content can be added to the page. That's a lot of start-and-stop that hits the end user. Do them a favor: put as much Javascript as possible into the footer of the page, even if that means building and serving an extra file of aggregated Javascript. I've looked at plenty of designs where all of the data has been sent but the user sees a white screen; that happens when page processing stalls before the complete page arrives. An HTML page is a list of instructions for marking up your text, and if you put the Javascript low on the page, most of those instructions (the HTML, CSS and images) are delivered and rendered before the slow stuff gets a chance to hold things up.
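The skeleton of the idea; the file names are placeholders:

    <html>
      <head>
        <link rel="stylesheet" href="/files/css/site.css" />
        <!-- No script tags up here: nothing blocks the first paint -->
      </head>
      <body>
        <p>All of the visible content arrives and renders first...</p>
        <!-- Aggregated Javascript loads last, just before body closes -->
        <script src="/files/js/site.js"></script>
      </body>
    </html>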

After you carry out these steps, check whether they're helping. I recommend two sites for benchmarking your performance:
http://www.webpagetest.org/ – really good at the metrics of success
http://websitegrader.com/ – good at the SEO worthiness of your site
