Shoot Out! Measuring the Performance of Web Analytics Software

What happens when you run multiple web analytics tools side by side to test and compare their performance? Do they give different results, and if so, why? Which one should you trust? Does one package consistently report higher or lower numbers than another? How do they compare when measuring visitors, unique visitors, and page views? A recent study suggests it pays to analyze the very tools that track and measure your data.

Stone Temple Consulting set out to test seven web analytics packages across four websites, running as many as six tools on a single site. The study, called the 2007 Analytics Shoot Out, is being presented in two stages. The interim report was officially released at the Emetrics Summit in San Francisco on May 6, 2007. A final, more comprehensive report will be released in July of 2007.

The following tools were tested:

1. Clicktracks
2. Google Analytics
3. IndexTools
4. Unica Affinium NetInsight
5. WebSideStory HBX Analytics
6. Omniture SiteCatalyst
7. WebTrends

The four participating websites were:

AdvancedMd.com
Citytowninfo.com
Homeportfolio.com
Toolpartsdirect.com

Contributors were:

1. John Biundo of Stone Temple Consulting
2. Jonah Stein of Alchemist Media
3. Rand Fishkin of SEOmoz
4. Jim Sterne of Emetrics

In addition to describing the methodology and goals of the Shoot Out, the first report raises initial concerns about the deletion of first- and third-party cookies and how it may lead to inaccurate data.

Cookie deletion rates are of great concern when evaluating web analytics. Every time a cookie is deleted, it affects the tool's visitor and unique visitor counts; unique visitor counting is hit the hardest. If a user visits a site in the morning, deletes their cookies, and then visits again in the afternoon, the totals for that day will show two different daily unique visitors, when in fact one user made multiple visits and should be counted as a single unique visitor.
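To make the inflation concrete, here is a minimal sketch (hypothetical, not from the Stone Temple report) of how a cookie-based tool tallies daily uniques: every distinct cookie ID it sees counts as one visitor, so a single person who clears cookies mid-day registers twice.

```python
def count_daily_uniques(visits):
    """Count unique visitors the way a cookie-based tool would:
    every distinct cookie ID seen during the day is one 'unique'."""
    return len({cookie_id for cookie_id, _ in visits})

# One real person visits twice in the same day.
# Morning: the tool sets cookie "abc123".
# The user clears cookies, so the afternoon visit gets a fresh ID.
visits = [
    ("abc123", "morning"),    # first visit, original cookie
    ("xyz789", "afternoon"),  # same person, new cookie after deletion
]

print(count_daily_uniques(visits))  # -> 2, though only one person visited
```

The tool has no way to link the two cookie IDs back to the same person, which is why even modest deletion rates can noticeably inflate unique visitor totals.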

The report also evaluated tool configurations and found “significant differences in the traffic numbers revealed by the packages.” It conducted and discussed page view analysis, measured various data collection areas, and gave vendors the chance to present their products' features and benefits.

The report concludes:

“Implementation of an analytics package requires substantial forethought and planning. And, when you are done with that, you have to check, and recheck your results, to make sure they make sense.”

You can view the first phase and initial findings overview in the 2007 Web Analytics Shootout – Interim Report.
