Tech giant Adobe recently conducted a study of website visits and found that more than a quarter of web traffic (~28%) demonstrated strong “non-human signals,” indicating that it likely came from bots, scrapers, or click farms.
Adobe’s investigation covered a “handful” of travel, retail, and publishing industry clients and specifically sought to uncover how much of their web traffic had non-human characteristics. The Company posits that with bot and click-farm data currently included in web traffic reports, digital marketers and advertisers aren’t getting a clear picture of site visitors and product purchasers, and therefore have unrealistic perceptions and expectations of their audience. However, “by weeding out that misleading data, brands can better understand what prompted consumers to follow their ads and ultimately visit their websites and buy their products.”
The Wall Street Journal article reporting on Adobe’s study goes on to explain that the findings will help ad buyers exclude non-humans from future targeting efforts by allowing them to strip out the cookies or unique web IDs associated with that non-human traffic.
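To make the idea concrete, here is a minimal sketch of how an ad buyer might turn bot-flagged visits into a targeting exclusion list. This is purely illustrative: the Visit record, the bot_score field, and the 0.8 threshold are assumptions invented for the example, not Adobe’s actual schema or scoring method.

```python
# Hypothetical sketch: building an ad-targeting exclusion list from
# visitor records that have already been scored for "non-human signals".
# The field names and the 0.8 cutoff are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Visit:
    cookie_id: str    # the visitor's cookie / unique web ID
    bot_score: float  # 0.0 (clearly human) .. 1.0 (clearly non-human)

BOT_THRESHOLD = 0.8  # assumed cutoff for "strong non-human signals"

def exclusion_list(visits: list[Visit]) -> set[str]:
    """Collect the web IDs an ad buyer would drop from future targeting."""
    return {v.cookie_id for v in visits if v.bot_score >= BOT_THRESHOLD}

# Example: two of three visits look automated and get excluded.
visits = [
    Visit("abc123", 0.95),
    Visit("def456", 0.10),
    Visit("ghi789", 0.88),
]
print(exclusion_list(visits))  # {'abc123', 'ghi789'}
```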
Importantly, Adobe understands the value of such a capability but is not yet monetizing a standalone product to flag non-human web traffic.
Instead, the Company plans to build the functionality into its current tools. While it’s not an altruistic effort by any stretch, it’s nice to see a company attempting to use its considerable scale and influence to help improve internet transparency.