Online Marketing: Understanding Web Analytics

As I have mentioned in a previous post, the Romanian online market still lacks skills and knowledge related to online marketing, a normal state of affairs given that we are talking about an emerging market. One of the main issues locally is the poor understanding of some of the tools that help us do proper online marketing, and today I want to touch upon web analytics.

I have witnessed and participated in countless discussions and debates about web analytics tools, the results they report, and why those results differ so much from one another when, in theory, they should all show the same numbers. The logic behind this debate is pretty simple: the number of visitors a site gets in a specific period of time should be the same regardless of who counts it, right? Well… the answer is neither that easy nor that simple.

Stone Temple Consulting, an American consulting company, picked up this debate from an SEO blog and ran a study in 2007 that used several web analytics tools to monitor traffic on several websites in parallel, then compared the results to see whether differences showed up, how big they were, and what could explain why the numbers from various web analytics tools diverge.

What they found out was that… well, I will just quote their conclusions, because I could not have put them better myself (I will just bold a few key phrases for emphasis):

“1. Web analytics packages, installed on the same web site, configured the same way, produce different numbers. Sometimes radically different numbers. In some cases the package showing the highest numbers reported 150% more traffic than the package reporting the least traffic.

2. By far the biggest source of error in analytics is implementation error. A Web analytics implementation needs to be treated like a software development project, and must be subjected to the same scrutiny and testing to make sure it has been done correctly.

Note that we had the support of the analytics vendors themselves in the implementations done for the 2007 Web Analytics Shootout, so we believe that this type of error was not a factor in any of the data in our report, except where noted.

3. Two other major factors drive differences in the results. One of these is the **placement of JavaScript on the site**, as being placed far down on a page may result in some users leaving the page before the JavaScript can execute. Traffic that is not counted as a result of the JavaScript can be considered an error, because the data for that visit is lost (or at least the data regarding the original landing page and, if the visitor came from the search engine, the keyword data would also be lost).

The other factor is differences in the definition of what each package is counting. The way that analytics packages count visitors and unique visitors is based on the concept of sessions. There are many design decisions made within an analytics package that will cause it to count sessions differently, and this has a profound impact on the reported numbers.

Note that this should not be considered a source of error. It’s just that the packages are counting different things, equally well for the most part.

4. Page views tend to have a smaller level of variance. The variance in ways an analytics package can count page views is much smaller. JavaScript placement will affect page views, but differences in sessionization algorithms will not. Simply put, if the tracking JavaScript on a page executes, it counts as a page view.

5. There are scenarios in which these variances and errors matter, particularly if you are trying to compare traffic between sites, or numbers between different analytics packages. This is, generally speaking, an almost fruitless exercise.

6. To help address these accuracy problems, you should calibrate with other tools and measurement techniques when you can. This helps quantify the nature of any inaccuracies, and makes your analytics strategy more effective.

7. One of the basic lessons is learning what analytics software packages are good at, and what they are not good at. Armed with this understanding, you can take advantage of the analytics capabilities that are strong and reliable, and pay less attention to the other aspects. Some examples of where analytics software is accurate and powerful are:

  • A/B and multivariate testing
  • Optimizing PPC Campaigns
  • Optimizing Organic SEO Campaigns
  • Segmenting visitor traffic

There are many other examples that could be listed. The critical lesson is that the tools are not accurate, but their relative measurements are worth their weight in gold.

In other words if your analytics package tells you that Page A converts better than Page B, that’s money in the bank. Or if the software tells you which keywords offer the best conversion rates, that’s also money in the bank. Or, if it says that European visitors buy more blue widgets than North American visitors – you got it – more money in the bank.

So enter the world of analytics accuracy below, and hopefully, you will emerge with a better appreciation of how to use these tools to help your business, as I did.”

(source: http://www.stonetemple.com/articles/analytics-report-august-2007.shtml)
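
To make the sessionization point in conclusion 3 more concrete, here is a minimal sketch in TypeScript (my own illustration, not code from the study or from any vendor): it replays the same hit log through two hypothetical counting rules that differ only in their session inactivity timeout. The page view totals match, but the visit totals do not, which is exactly the kind of definitional difference the study describes. The hit data and the 30- and 15-minute timeouts are assumptions chosen for the sake of the example.

```typescript
// Minimal sessionization sketch (illustrative, not any vendor's real algorithm).
// A "hit" is one tracked page view: who viewed it, and when.
interface Hit {
  visitorId: string; // in practice derived from a cookie
  timestamp: number; // milliseconds since epoch
}

// Count page views and visits from the same hit log, given a session
// inactivity timeout. A new visit starts when the gap between two hits
// from the same visitor exceeds the timeout.
function count(hits: Hit[], timeoutMinutes: number) {
  const timeoutMs = timeoutMinutes * 60 * 1000;
  const lastSeen = new Map<string, number>();
  let pageViews = 0;
  let visits = 0;

  for (const hit of [...hits].sort((a, b) => a.timestamp - b.timestamp)) {
    pageViews++; // every executed tag counts as a page view, sessions aside
    const prev = lastSeen.get(hit.visitorId);
    if (prev === undefined || hit.timestamp - prev > timeoutMs) {
      visits++; // first hit, or gap too long: a new session begins
    }
    lastSeen.set(hit.visitorId, hit.timestamp);
  }
  return { pageViews, visits };
}

// One visitor, three page views, with a 20-minute pause in the middle.
const minute = 60 * 1000;
const log: Hit[] = [
  { visitorId: "a", timestamp: 0 },
  { visitorId: "a", timestamp: 5 * minute },
  { visitorId: "a", timestamp: 25 * minute },
];

// Same data, different sessionization rules, different "traffic":
console.log(count(log, 30)); // { pageViews: 3, visits: 1 }
console.log(count(log, 15)); // { pageViews: 3, visits: 2 }
```

This is also why conclusion 4 holds: page views depend only on whether the tag executed, while visits depend on each package's sessionization choices.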

I am sure that this quite informative material will not quench the heated debates about the differences between various web analytics tools (such as Trafic.ro, SATI or Google Analytics, probably the most widely used on the Romanian online market today), nor the claims that one or another of them sucks because “it is not accurate.” For those to whom two or more web analytics tools are not enough to measure their web site’s audience, I also recommend the new Yahoo! Web Analytics, the former IndexTools, which became a free tool after its acquisition by the Internet giant (yes, there is some irony here, for those who missed it).

However, the idea is to try to use a single web analytics tool because, as the Stone Temple Consulting study showed, comparing two or more web analytics packages and their results is a “fruitless exercise.” Just find the tool that offers the information you need in order to grow your web site, and use it to study your audience and the trends and changes that happen over time. In the end, time spent focusing on a single data set and turning it into actionable objectives is better invested than time wasted comparing results from various tools and continuously pondering why they do not match.
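
To close with a concrete illustration of the study’s “relative measurements” lesson, here is a small TypeScript sketch with made-up numbers: even if a tool’s absolute visit counts are off, a comparison between two pages within the same data set remains actionable.

```typescript
// Illustrative only: hypothetical numbers from a single analytics tool.
interface PageStats {
  page: string;
  visits: number;      // as reported by the tool (absolute value may be off)
  conversions: number; // goal completions attributed to this page
}

const report: PageStats[] = [
  { page: "Page A", visits: 1200, conversions: 48 },
  { page: "Page B", visits: 1100, conversions: 22 },
];

// If the tool over- or under-counts visits by roughly the same factor on
// both pages, that factor cancels out when comparing conversion rates.
for (const { page, visits, conversions } of report) {
  const rate = (conversions / visits) * 100;
  console.log(`${page}: ${rate.toFixed(1)}% conversion rate`);
}
// Page A: 4.0%, Page B: 2.0% -> send more traffic to Page A.
```

Whether the true visit counts are 1,200 and 1,100 or twice that, Page A still converts roughly twice as well as Page B, and that is the actionable part.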