The Google Panda algorithm, a major rewrite of Google's core ranking
algorithm that launched in early 2011, has had some unintended consequences
on the web. While the intention was to curb abuse from link farms,
scrapers, and paid linking, a recent ZDNet article notes that the result has
been a mass removal of hyperlinks, even legitimate ones.
Panda is turning what used to be a best practice, having
lots of links from other sites, into a warning signal that your site might be
over-optimized and deceitful. Beyond incentivizing blogs and other sites to
remove legitimate hyperlinks for fear of being tagged as illegitimate, it has
also created a new generation of "page-rank assassins": a website can
sink a competitor's search ranking simply by paying link farms to link to the
competitor's site. Google detects the bad links, demotes the competitor in
search results, and the attacker has effectively hobbled the competition.
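To make the mechanics concrete, here is a toy sketch of how a penalty-based link signal can be turned against a competitor. This is purely my own illustration, not Google's actual scoring: the domain names, the -5 penalty weight, and the scoring function are all invented for the example.

```python
# Toy illustration only -- not Google's actual algorithm. It assumes a
# hypothetical model where links from domains flagged as link farms
# subtract from a site's score instead of simply being ignored.

FLAGGED_LINK_FARMS = {"linkfarm-a.example", "linkfarm-b.example"}

def toy_link_score(inbound_links):
    """Score a site from its inbound links: +1 per ordinary link,
    -5 per link from a flagged domain (a made-up penalty weight)."""
    score = 0
    for domain in inbound_links:
        score += -5 if domain in FLAGGED_LINK_FARMS else 1
    return score

# A competitor with 20 legitimate inbound links...
competitor = ["news-site.example"] * 20
print(toy_link_score(competitor))  # 20

# ...after an "assassin" pays link farms to point 10 links at them:
competitor += ["linkfarm-a.example"] * 10
print(toy_link_score(competitor))  # 20 - 50 = -30: the penalty dominates
```

Under a model like this, the victim never has to do anything wrong; the attacker's purchased links alone are enough to sink the score.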
ZDNet argues that the real issue with Panda is its inability
to distinguish high-quality content from low-quality content, and copied content from original
content. In some cases, "scraper" sites (sites that copy content from other
websites and wrap it in ads) are in fact scoring higher than the original sites
themselves. Some firms have gone as far as issuing legal warnings telling people to
stop linking to their site! (http://boingboing.net/2012/07/20/seo-company-rep-says-its-ill.html)
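For a sense of what "intelligence about copied content" might look like, here is a hedged sketch of one textbook approach: near-duplicate detection with word shingles and Jaccard similarity, combined with a first-seen crawl timestamp to decide which copy is the original. Nothing here reflects Google's internal implementation; the threshold and data layout are assumptions for the example.

```python
# Hypothetical sketch of attributing originality in a crawl:
# near-duplicate detection via word shingles plus a first-seen timestamp.

def shingles(text, k=5):
    """Return the set of k-word shingles in a document."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity between two shingle sets (0.0 to 1.0)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def likely_scraper(page, corpus, threshold=0.8):
    """Flag `page` as copied if a near-duplicate exists in `corpus`
    that was crawled earlier (dicts with 'text' and 'first_seen')."""
    for other in corpus:
        if other is page:
            continue
        similarity = jaccard(shingles(page["text"]), shingles(other["text"]))
        if similarity >= threshold and other["first_seen"] < page["first_seen"]:
            return True
    return False
```

The obvious weakness of the first-seen heuristic is that a fast scraper can sometimes be crawled before the original it copied, which may be exactly why scraper sites occasionally outrank their sources.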
While I understand Panda's goal of reducing link farms and
punishing black-hat perpetrators, it seems that Panda could use additional
intelligence for distinguishing original content from copied content. Any ideas on how Panda could be
improved? Or how Google could stop page-rank assassins?
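One idea, sketched against the toy model above: if links from flagged domains were simply ignored (weight 0) rather than penalized, bought link-farm links could not drag a competitor's score down, only waste the attacker's money. Again, this is just an illustration of the idea, not a description of Google's policy.

```python
# Same toy setup as before, but suspect links are ignored instead of penalized.
def toy_link_score_ignore(inbound_links, flagged):
    return sum(0 if domain in flagged else 1 for domain in inbound_links)

competitor = ["news-site.example"] * 20 + ["linkfarm-a.example"] * 10
print(toy_link_score_ignore(competitor, {"linkfarm-a.example"}))  # still 20
```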