When people ask me “what should I track?”, my usual philosophy is that tracking should record in detail anything that is analytically valuable, while keeping reporting as easy and digestible as possible. In a forthcoming article, I’ll discuss the second piece of this statement; here I’d like to say a few words about the first piece. The key phrase is “analytically valuable”.
There is such a thing as “overkill” in Web Analytics. It often occurs when communication between the business requirements group (or product owner) and the team responsible for designing and implementing the tags is poor, so the implementers adopt a “cover our bases” policy and tag everything they can think of. The result: scores of success events on the site, meaningless variable roll-ups, over-complex campaign tracking codes, and custom link-tracking server requests that slow down site performance and flood the web analytics tool with data whose analysis would literally not be worth the effort.
What does “analytically valuable” mean, exactly? It does not refer to data for reporting or KPIs (that’s what variable structure is for); rather, it refers to a data-set whose comprehensive analysis would be worth the time and resources, producing valuable, actionable recommendations about website design, user behavior, or marketing effectiveness. A deep-dive analysis on usage of the “close” button from different popup windows is probably not worth the money. A team implementing web analytics on a site has to ask not only “can we capture this behavior?”, but more importantly, “can we foresee someone analyzing this data to produce actionable results that would be worth the resources expended?”
This last element is why the overall budget allocated to a website, now and in the future, should be taken into account when designing a WA implementation. To a web analyst this may sound like heresy, but in the real world it’s why mom-and-pop shops choose Google Analytics (if anything) over Omniture. Alexander’s Pizza Shop on Main Street might have a website with a menu, directions, and pictures, but it would obviously waste its money implementing NetInsight. It would be dishonest to recommend a state-of-the-art, “measure everything” implementation when it is clear that the resources spent executing it and analyzing the results would be wasted, because the website is small-scale and not a significant part of business success. That’s overkill. If the site becomes a bigger piece of the business, then a more robust implementation might be warranted.
Even within robust, enterprise-level implementations, measurement overkill is possible. Here are some examples:
· Over-analysis of the Site Map: an extensive analysis of a Site Map is probably not worth the effort, because Site Maps often exist more for SEO than for anything else (though if yours is used more than your navigation, you have a problem!).
· Over-use of campaign classifications: channel, creative, ad type, adgroup, keyword, date-stamp, banner size, link within an email, and landing page can all be legitimate and useful pieces of a campaign tracking code. But requirements and resources will dictate which are most important; tracking all of them in combination produces thousands of permutations, so before implementation, ask whether each one is actually useful. A walk-then-run approach might be more appropriate.
· Over-use of success events: while websites can legitimately have multiple success events, some implementers take advantage of the dozens available and tag almost any action as a separate success event. This leads to opaque reporting, because it becomes unclear which of these success events should be included in an overall effectiveness calculation.
· Over-redundancy in page-naming: some sites record the URL as the page, pass a user-friendly name into one or two variables, pass a hierarchically-defined page name into another variable, pass yet another version of the page name via an onclick handler on the link clicked, and also populate site-section and hierarchy variables. When a manager wants to see a pages report, they don’t know which one to use: “www.mysite.com”, “homepage”, “hp_”, “mycompany|mainsite|homepage” – you get the picture. And chances are, the numbers for these won’t match up.
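The campaign-classification point above is easy to quantify: combining even a handful of small dimensions multiplies into a surprising number of distinct tracking codes. Here is a minimal sketch, assuming hypothetical dimension names, values, and an underscore separator (not any vendor’s required format):

```python
from itertools import product

# Hypothetical campaign-code dimensions; names, values, and the "_"
# separator are illustrative only.
dimensions = {
    "channel":  ["email", "display", "search"],
    "creative": ["a", "b"],
    "keyword":  ["shoes", "boots", "sandals"],
    "landing":  ["hp", "promo"],
}

# Every distinct tracking code that "track everything in combination" creates.
codes = ["_".join(combo) for combo in product(*dimensions.values())]

print(len(codes))   # 3 * 2 * 3 * 2 = 36 codes from just four small dimensions
print(codes[0])     # email_a_shoes_hp
```

With realistic value counts (hundreds of keywords, dozens of creatives), the product quickly reaches the thousands of permutations mentioned above, each of which becomes a row someone must interpret.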
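One antidote to the page-naming problem above is to derive a single canonical, hierarchical page name and populate it everywhere, instead of maintaining four competing versions. This is a sketch under a simplifying assumption: that the URL path maps cleanly onto the site hierarchy (a real site would need a lookup table for exceptions); the `SITE` identifier and the `|` delimiter are hypothetical choices:

```python
from urllib.parse import urlparse

SITE = "mysite"  # hypothetical site identifier

def page_name(url: str) -> str:
    """Derive one canonical, hierarchical page name from a URL.

    Assumes the path maps cleanly onto the site hierarchy; real
    implementations need a lookup table for exceptions.
    """
    path = urlparse(url).path.strip("/")
    parts = path.split("/") if path else ["homepage"]
    return "|".join([SITE] + parts)

print(page_name("http://www.mysite.com/"))                # mysite|homepage
print(page_name("http://www.mysite.com/products/shoes"))  # mysite|products|shoes
```

Because every variable that needs a page name calls the same function, the pages report, the hierarchy report, and any click-level tracking all agree on what a page is called, and the numbers have a chance of matching up.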
There are many other examples, and I’m probably guilty of a few.
I’m not saying that a “measure everything” approach is bad – on the contrary, with analytical resources available, it can be a vast asset for optimizing the value of the online channel. Rather, I’m saying that a “measure everything” approach has to take into account the analytical value of the data and the resources required to implement it and take advantage of it.