Using your analytics data to improve web site performance

It’s one thing to look at your reports; it’s another to start drawing conclusions from your data. Different people will look at the data in different ways. Most often, the reports are reviewed by the marketing department or person to see how to better market the site, what search engine optimization needs to occur, and how effective the pay-per-click ads are. All of these are important uses of web analytics, but they are not the only ones.

I want to look at a specific example of how I, as a web developer, recently went about making a site faster based upon the results of some analytics reports.

The 80/20 rule of the web

We’ve all heard of the 80/20 rule. On the web it would be that 80% of people only use 20% of your web site. In reality, depending upon your site, it could be more like 90/10. A novice web developer will try to optimize all of the pages. An experienced web developer doesn’t bother with the pages that receive only the smallest number of visitors.

On an Intranet web site I maintain, which tracks the employees’ internal education process, I was looking at the internal search feature. The search results are loaded into the same page. Counting the searches performed, this page accounted for 24% of all page views, making it the second most visited page on the site. (The #1 page is mostly static text, so no real improvements could come from there.)

Put another way, almost 1 out of every 4 page views went to this internal search engine; for every 2 visitors to the site, approximately 3 searches were performed on top of the normal page loads, so some people were clearly searching multiple times.

This became an area to consider improving, as a small improvement here would have more effect than a large improvement on a page that accounts for only 1% or less of the views.

To address this, I decided to load the results in with AJAX instead of reloading the whole page. This change would improve search response in 3 ways.
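Before going through those three areas, here is a minimal sketch of the client-side idea. It assumes a hypothetical /search-results endpoint that returns just the results markup, and made-up element ids; it is an illustration of the approach, not the actual code from the site.

async function runSearch(criteria: string): Promise<void> {
  // One request for just the results markup, instead of reloading the page
  // and every file it references.
  const response = await fetch(
    `/search-results?q=${encodeURIComponent(criteria)}`
  );
  const fragment = await response.text();

  // Swap only the results area; the search form and the rest of the page
  // stay exactly as they are.
  const container = document.getElementById("search-results");
  if (container) {
    container.innerHTML = fragment;
  }
}

document.getElementById("search-form")?.addEventListener("submit", (event) => {
  event.preventDefault(); // stop the full page reload
  const input = document.getElementById("search-box") as HTMLInputElement;
  void runSearch(input.value);
});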

Network Connections

Each time a file is requested over the Internet, the web browser checks whether there is a newer version on the server or whether it can use a copy it has already downloaded (cached). Either way, the server spends resources on the check or the download, and the end-user has to wait for the files to finish checking and downloading. Reducing the number of network connections per search means the server can serve other visitors faster, and the site feels faster to the end-user because the browser has less work to do.
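As a hedged illustration of that check, here is a sketch of the revalidation round trip using Node’s built-in http module (the file name and port are assumptions): the browser sends back the Last-Modified value it cached as an If-Modified-Since header, and the server either answers 304 Not Modified or sends a fresh copy. Either answer still costs a connection and a round trip.

import { createServer } from "node:http";
import { readFileSync, statSync } from "node:fs";

const ASSET = "search.js"; // hypothetical static file the page references

const server = createServer((req, res) => {
  const lastModified = statSync(ASSET).mtime.toUTCString();

  // The browser echoes back the Last-Modified value it cached.
  if (req.headers["if-modified-since"] === lastModified) {
    // The cached copy is still good: nothing is re-sent, but the connection
    // and the round trip still take server resources and visitor time.
    res.writeHead(304);
    res.end();
    return;
  }

  res.writeHead(200, {
    "Content-Type": "application/javascript",
    "Last-Modified": lastModified,
  });
  res.end(readFileSync(ASSET));
});

server.listen(8080);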

The search page referenced 15 external files, plus itself. Of those, 12 wouldn’t have to be requested anymore, so I was able to reduce the number of network connections for the search results by 80%. This was our biggest savings. The number of connections a web server can handle is limited by both count and bandwidth, so the more connections in use, the slower each one appears.

Processing of Files

Knowing in advance that it was going to be a popular page, I had tried to make the page as efficient as possible from the start. However, because you now never leave the page, some of its features, like remembering the previous search criteria, were no longer needed, and I was able to simplify the page.

The initial page load took about the same amount of time; however, when the search results displayed, the file appeared to process a little less than one hundredth of a second faster, about 4.45% faster. Individually that does not yield much savings, but applied as many times as it is on a page this popular, the savings add up, especially during peak times.

Size of Results File Download

I knew I would save a lot in this area. The search interface, consisting of 6 controls and a variety of other HTML sections, would no longer have to be duplicated when loading the results. This means faster downloads and an improved perception of the speed of the web page and server. Based on the same sample set of data, there was an 11K reduction in the overhead of the search results, or an approximately 14% smaller download. That was for a search with over 100 results; many searches return fewer results (20-50), so the savings in bandwidth, while still approximately 11K, would be a much higher percentage (30-75%).
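To make that concrete, here is a hedged sketch of the server side of the same idea: the results endpoint returns only the matching records as an HTML fragment, never the surrounding page or the search form. The record shape and markup are invented for illustration.

// Hypothetical record shape, purely for illustration; the real fields differ.
interface CourseRecord {
  title: string;
  completedOn: string;
}

// Render only the results fragment. The page shell, the search form, and the
// 6 controls are never re-sent, which is where the per-search overhead goes away.
function renderResultsFragment(results: CourseRecord[]): string {
  if (results.length === 0) {
    return "<p>No matching records found.</p>";
  }
  const rows = results
    .map((r) => `<li>${r.title} (completed ${r.completedOn})</li>`)
    .join("");
  return `<ul class="search-results">${rows}</ul>`;
}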

Final Thoughts

Because I could determine from our analytics package which files were being requested the most, I was able to get the most bang for my buck. This update will take some load off the server, allowing all of the pages to benefit from the enhancement, and because of how often this page is viewed, almost every visitor will notice a performance improvement in the site.

This is how a web developer can effectively use web analytics to improve the performance of the web site.

About Walter Wimberly

Walter is a strong believer in using technology to improve oneself and one's business.