At the Semphonic XChange web analytics conference, a representative from Yahoo! Analytics made the bold prediction that “JavaScript will be gone in 5 years.” The context was a huddle session on universal tagging strategies, the idea being that data-collection processes will, over time, have to be managed and controlled in-house, in local warehouses, rather than in the current web analytics vendors’ JavaScript-tag-based “cloud.”
This is an intriguing prediction, and there are many arguments for this point of view. JavaScript tagging has been around, in force, for let’s say 6 years. Before that, web analytics was a matter of crunching server log files, either in a locally hosted data warehouse or delivered to third-party vendors who used proprietary software to analyze the files and produce reports. With Omniture, HBX, WebTrends, NetInsight, CoreMetrics, and Google, the web analytics paradigm shifted to JavaScript tags that execute image requests on the page and store the data externally, where it is accessed through an online interface. This model has taken off, and I see no reason why it will not continue to be vibrant in the next few years.
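For readers who haven’t peeked under the hood, a page tag boils down to a few lines of script that assemble an image URL and fire it at the vendor’s collection server. Here is a minimal sketch of that mechanism; the collector hostname, path, and parameter names are my own illustrations, not any vendor’s actual API:

```typescript
// Minimal page-tag sketch: gather page context, then fire a 1x1 image
// request at a (hypothetical) collection server. Real vendor tags add
// cookies, plug-ins, and dozens of parameters, but the mechanism is the same.
function trackPageView(collector: string): void {
  const params = new URLSearchParams({
    url: document.location.href,             // page being viewed
    ref: document.referrer,                  // where the visitor came from
    res: `${screen.width}x${screen.height}`, // screen resolution
    t: Date.now().toString(),                // cache-buster
  });
  // The browser requests the image; the server logs the query string and
  // returns a transparent 1x1 GIF. Nothing visible ever renders on the page.
  const beacon = new Image(1, 1);
  beacon.src = `https://${collector}/b/ss?${params.toString()}`;
}

trackPageView("metrics.example.com"); // hypothetical collector host
```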
However, and inspired by this comment from Yahoo!, there are valid arguments why companies may slowly start to shift away from the JavaScript tagging model. Firstly, there’s mobile. Unless mobile technologies universally support JavaScript execution, large chunks of traffic to a website will go unnoticed. Non-JavaScript solutions for mobile tracking are, of course, possible, but these are band-aids that essentially trick the solution into accepting data through raw image requests, as sketched below. Such implementations require significant resources and are often not well integrated into the overall measurement solution. Not to mention persistent cookies on a mobile device…
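The band-aid in question is usually a hardcoded image request: since nearly every mobile browser can render an img tag even without script support, the server builds the beacon URL itself and embeds it in the page. A hedged sketch of what that server-side generation might look like (parameter names again invented for illustration); note how little the server can capture compared with a real tag, which is exactly why these implementations integrate poorly:

```typescript
// Server-side sketch: for clients without JavaScript, embed a hardcoded
// beacon <img> in the HTML so the image request itself carries the data.
// The page URL and referrer must be filled in by the server, since there
// is no script on the client to read them (and no screen resolution,
// browser plug-ins, etc., at all).
function renderBeaconTag(collector: string, pageUrl: string, referrer: string): string {
  const params = new URLSearchParams({
    url: pageUrl,
    ref: referrer,
    nojs: "1",                // flag so reports can segment non-JS traffic
    t: Date.now().toString(), // cache-buster
  });
  return `<img src="https://${collector}/b/ss?${params.toString()}" width="1" height="1" alt="" />`;
}
```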
Secondly, there is more and more interest in data integration. Offline data is being integrated with online data for marketing, merchandising, and optimization purposes. Purchase transaction data (always more accurate in a back-end system than in a web analytics solution), call-center data, catalogue purchases and registrations: basically, all the data traditionally managed by a BI team is increasingly being married to online web behavior. This is all good, but at the same time, does it really make sense to push data to an Omniture or a WebTrends if the sole purpose is to pull it back into your database?
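One common bridge between the two worlds is a shared key: the page tag records the same transaction ID the back-end system stores, and the join happens in your own database rather than round-tripping everything through the vendor. A rough sketch of that join, with record shapes I have invented for illustration:

```typescript
// Sketch of offline/online integration via a shared transaction ID.
// WebHit comes from the analytics export; BackendOrder from the
// transactional database. Both shapes are hypothetical.
interface WebHit { transactionId: string; campaign: string; visitorId: string; }
interface BackendOrder { transactionId: string; revenue: number; returned: boolean; }

// Attribute accurate back-end revenue to the campaign that drove the visit.
function revenueByCampaign(hits: WebHit[], orders: BackendOrder[]): Map<string, number> {
  const byId = new Map<string, BackendOrder>();
  for (const o of orders) byId.set(o.transactionId, o);

  const totals = new Map<string, number>();
  for (const hit of hits) {
    const order = byId.get(hit.transactionId);
    if (!order || order.returned) continue; // trust the back end, not the tag
    totals.set(hit.campaign, (totals.get(hit.campaign) ?? 0) + order.revenue);
  }
  return totals;
}
```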
Thirdly, there is the growing issue of cookie deletion and cookie rejection. Whether you’re using first-party or third-party cookies, the elephant in the room is a rejection rate as high as 3-7%, depending on what kind of site you have. Media sites have a smallish rejection rate, but it tends to be higher for retail and financial-services sites, where visitors are more naturally inclined to tighten their privacy settings (and imagine what this rate might be for adult entertainment sites!). Solutions like Omniture can still track visitors based on IP address or other hints, but customized tracking remains problematic. Cookie deletion is also a problem: Norton AntiVirus software now labels DoubleClick, Omniture, WebTrends, and other web measurement cookies as “non-virus tracking cookies” with “low risk,” but nevertheless recommends deleting them after a scheduled scan. This obviously has an impact on visitor measurement over time. My out-of-the-box Norton AntiVirus runs once a week.
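In practice, the fallback works something like the sketch below: prefer the persistent cookie ID, and if the cookie is rejected or has been deleted, fall back to a fingerprint built from hints such as IP address and user agent. The hashing and field choices are mine, not any vendor’s documented method, and the code hints at why the fallback is so unreliable:

```typescript
import { createHash } from "node:crypto";

// Sketch of cookie-less visitor identification: use the persistent cookie
// ID when present; otherwise hash IP address + user agent. The fallback is
// far less reliable: shared proxies collapse many visitors into one ID,
// and DHCP churn splits one visitor into many.
function resolveVisitorId(cookieId: string | undefined, ip: string, userAgent: string): string {
  if (cookieId) return cookieId; // stable across visits, until deleted
  return "fp-" + createHash("sha1").update(`${ip}|${userAgent}`).digest("hex");
}

// Example: a visitor whose weekly Norton scan just wiped the cookie.
console.log(resolveVisitorId(undefined, "203.0.113.7", "Mozilla/5.0 ..."));
```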
JavaScript-based tagging solutions for web measurement are not going away any time soon, but in five years the web analytics industry may well start to go back to where we were 6 years ago: using and integrating internal log files and databases. I can only wonder where we would be now if there had been no such thing as JavaScript tagging, and the solutions for log-file integration had been given those 6 years to advance.