Friday, February 26, 2010

The Difficulty in Tracking Web Traffic

There is a lot of debate over how to accurately measure web traffic. While the average consumer isn't particularly worried about how web traffic is measured, those who buy and sell advertising space care a great deal about these numbers. If a site's traffic is undercounted, that site can't charge advertisers as much as they would be willing to pay for its true audience. And advertisers obviously want to have confidence in the size of their target audience before shelling out money on advertising agreements.

Web-traffic measurement, despite recent advances, remains fraught with conflicting numbers. The Internet's inherent accountability, stemming from the digital trace left by every Web site visit, has spawned a multitude of measures, but little clarity.

For big sites such as Facebook and Yahoo, the differing numbers might matter chiefly for bragging rights. Smaller sites, though, say that mismeasurement of their traffic could cost them when advertisers seeking a broad reach dismiss them because of their seemingly paltry audience sizes.

Bob Bowman, chief executive of MLB Advanced Media, the Major League Baseball digital arm, goes further. "Our numbers are wildly different from what comScore and Nielsen are showing, to a point where it's materially damaging to our business."

ComScore and Nielsen say such criticisms reflect a misreading by Web sites of their own user base, which is reflected most often as a tally of monthly unique visitors. Online publishers typically can gauge their own traffic through logs on their Web servers recording every request for the site, or by assigning unique tags, known as cookies, to each Web browser that visits the site.
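To see how a publisher's own count works in principle, here is a minimal sketch of tallying unique visitors from a server log. The log format and cookie IDs are hypothetical, invented for illustration; real logs are messier, but the naive logic is the same: one distinct cookie ID equals one "visitor."

```python
def count_unique_visitors(log_lines):
    """Naive unique-visitor tally: one distinct cookie ID = one visitor."""
    seen = set()
    for line in log_lines:
        # Assume a simple "timestamp cookie_id path" format for illustration.
        _, cookie_id, _ = line.split()
        seen.add(cookie_id)
    return len(seen)

log = [
    "2010-02-01T09:00 c1 /home",
    "2010-02-01T09:05 c1 /scores",  # same browser, not a new visitor
    "2010-02-01T12:00 c2 /home",    # same person at work gets a new cookie
]
print(count_unique_visitors(log))  # prints 2, though it may be one person
```

The last log line shows exactly why this count drifts upward: the same person visiting from a second machine is indistinguishable from a second person.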

Both of these techniques, though, tend to overestimate visitors, for several reasons. For instance, the same person might visit from home, from work and from a mobile device, and be counted as a different user each time. (A computer also might be shared by several people, which could lead to an undercount.) Another problem arises because many people frequently delete their cookies, so they are counted multiple times. And then there's the problem of search engine bots that view web pages multiple times, further inflating the traffic count.
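The bot problem, at least, is partly tractable with simple heuristics. The sketch below filters hits whose user-agent string looks like a known crawler; the marker list and sample hits are illustrative assumptions, not any measurement firm's actual rules, and real bot detection is far more involved.

```python
# Assumed heuristic list of substrings that mark common crawlers.
BOT_MARKERS = ("bot", "crawler", "spider", "slurp")

def is_probable_bot(user_agent):
    """Crude check: does the user-agent contain a known crawler marker?"""
    ua = user_agent.lower()
    return any(marker in ua for marker in BOT_MARKERS)

hits = [
    ("c1", "Mozilla/5.0 (Windows NT 6.1)"),                  # likely human
    ("c2", "Googlebot/2.1 (+http://www.google.com/bot.html)"),
    ("c3", "Yahoo! Slurp"),
]
human_hits = [(cid, ua) for cid, ua in hits if not is_probable_bot(ua)]
print(len(human_hits))  # prints 1
```

Stripping obvious crawlers this way shrinks the raw count, but it does nothing for the harder problems of shared computers, multiple devices, and deleted cookies.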

ComScore recently introduced a new tool to address some of these issues. Clients that add a comScore cookie to their site now have their audiences counted using both comScore's panel and their own direct traffic counts. ComScore starts by scrubbing the cookies of bots and international visitors. Then it uses its own panel data to attempt to correct for cookie deletion and repeat visitors.
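Conceptually, a panel-based correction boils down to a scaling factor. The sketch below assumes panel data suggest the average person accounts for some number of cookies per month (home, work, deletions) and divides the raw count accordingly; both the 2.5 figure and this simple division are illustrative assumptions, not comScore's actual methodology.

```python
def adjusted_unique_visitors(raw_cookie_count, cookies_per_person):
    """Scale a raw cookie tally down by the panel-estimated
    cookies-per-person ratio to approximate real people."""
    return raw_cookie_count / cookies_per_person

# Hypothetical numbers: 1,000,000 distinct cookies, panel says 2.5 per person.
print(adjusted_unique_visitors(1_000_000, 2.5))  # prints 400000.0
```

The accuracy of the adjusted figure then rests entirely on how representative the panel is, which is precisely where publishers and measurement firms disagree.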

While several ComScore clients have reported jumps in traffic from this new approach, many smaller sites still complain that they are being left out. ComScore has offered to measure nonclients' traffic for a $5,000 fee that covers six months. This has made small firms feel that they need to pay to prove the true level of traffic their sites receive.

The issue of accurately measuring web traffic will likely continue to be fought over by all parties with a vested interest in the reported numbers. Hopefully, new measurement methods that all sides can agree on will emerge.
