Hi everyone,
I found a very interesting infographic on hubspot.com (see below) that shows the history of major Google algorithm changes since the engine's birth.
Note that these are only the major ones; apparently Google changes its search algorithm around 500–600 times each year.
Why is that? Every time Google found a weakness in its ability to deliver relevant, high-quality search results (or, I guess, when "marketers" found a way to exploit that weakness), it made fixes to address it.
Here's my thought.
Given that:
1. No one knows Google's search algorithm
2. Whenever someone identifies a flaw in it and tries to "cheat", Google makes fixes
3. The ultimate goal for Google is to deliver relevant, high-quality search results
Is SEO really different from building a website with high-quality content and a rational structure, one that online users like and therefore link to?
Is building a website different from writing a book or making a movie? If the book is great, everyone will love it and look for it in bookstores (which will place it right by the entrance). If it's not, are there really effective tricks to force people to buy or read it?
Comments are welcome!
Jacopo
1 comment:
Related to Jacopo's post and to the frequency at which search algorithms change, I found an interesting article on the impact that Google's roll-out of Panda 4.0 in mid-May had on eBay:
http://www.marketingtechnews.net/news/2014/jun/02/google-takes-on-ebay-will-other-big-brands-follow-suit/
Apparently, with the update to its algorithm, Google penalized eBay, causing a significant drop in its SEO keyword rankings. Could such a big negative impact have been caused by the algorithm update alone, or are there other reasons for such a drop in rankings?
How should Google manage the trade-off between updating its algorithms frequently to deliver relevant search results to users, and limiting the negative impact these changes can have, especially on smaller websites but also on big brands?