Monday, September 29, 2014

Tricking Google for Good

Last week's class discussion went into great detail about how Google has optimized its algorithms / "pigeons" over the years to bring us the most relevant content possible. It has done so not just in a passive manner, but has gone so far as to penalize websites for trying to trick Google Search into giving their site a higher page rank.
This opens up two key issues for me, the first tying into the second:
  1. How do we make sure we don't live in an "information bubble" while still getting relevant results?
  2. Where do we stand on specific cases in which online activists use unauthorized tactics to further their cause?
The first issue has been coined the "filter bubble" and is best summarized as follows: a filter bubble is the result of a personalized search in which a website's algorithm selectively guesses what information a user would like to see based on information about that user (such as location, past click behavior and search history); as a result, users become separated from information that disagrees with their viewpoints, effectively isolating them in their own cultural or ideological bubbles.
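To make that mechanism concrete, here is a minimal, purely illustrative sketch of how such personalization could work. This is my own toy Python example, not Google's actual algorithm; the result topics, the click-history bonus, and the 0.5 weight are all made-up assumptions:

    from collections import Counter

    def personalize(results, click_history):
        # results: list of (url, topic, base_relevance) tuples
        # click_history: list of topics the user has clicked on before
        topic_counts = Counter(click_history)

        def score(result):
            _url, topic, base_relevance = result
            # Hypothetical weighting: familiar topics get a bonus proportional
            # to how often the user has engaged with them in the past.
            return base_relevance + 0.5 * topic_counts[topic]

        return sorted(results, key=score, reverse=True)

    if __name__ == "__main__":
        results = [
            ("site-a.example", "viewpoint-a", 0.70),
            ("site-b.example", "viewpoint-b", 0.72),
            ("site-c.example", "sports", 0.60),
        ]
        history = ["viewpoint-a", "viewpoint-a", "sports"]
        # site-b has the higher base relevance, but site-a ranks first because
        # the user's history makes "viewpoint-a" look more relevant.
        for url, topic, _ in personalize(results, history):
            print(url, topic)

Even in this toy version, the result with the highest "objective" relevance loses to the one the user has clicked on before, which is the bubble in a nutshell.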

For the second issue, I'd like to point to this blog post from 2013, which makes it a very old post in terms of Google's "weather updates". One great example from it is as follows:

So what if we used Google to find popular search phrases that lend themselves to feminist lessons? The #1 autocomplete for “Why Do Women” is the phrase “Why Do Women Cheat?”. It’s something people desperately want to know and, currently, many have tried to answer. And as you might imagine, many of the results for this search are pretty sexist. But what if a feminist blog wrote a response to this question (light on the feminist jargon, heavy on feminist ideals) and, using all the appropriate Search Engine Optimization techniques, managed to climb the ranks of the search results? Quite a few non-feminists could be reached by progressive messages.
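Out of curiosity, here is a rough sketch of how one could pull such autocomplete phrases programmatically. It uses Google's unofficial suggest endpoint (suggestqueries.google.com), which is undocumented and could change or disappear at any time, so treat it as an assumption rather than a supported API:

    import json
    import urllib.parse
    import urllib.request

    def autocomplete(prefix):
        # Unofficial, undocumented endpoint; may change or be rate-limited.
        url = ("https://suggestqueries.google.com/complete/search"
               "?client=firefox&q=" + urllib.parse.quote(prefix))
        with urllib.request.urlopen(url) as response:
            charset = response.headers.get_content_charset() or "utf-8"
            data = json.loads(response.read().decode(charset))
        # The response looks like [query, [suggestion, suggestion, ...], ...]
        return data[1]

    if __name__ == "__main__":
        for phrase in autocomplete("why do women"):
            print(phrase)

Running it with the prefix "why do women" would list the suggestions an activist blogger might choose to answer.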

As internet activist Eli Pariser pointed out, "a world constructed from the familiar is a world in which there's nothing to learn ... (since there is) invisible autopropaganda, indoctrinating us with our own ideas." So maybe we should pressure Google to engage in positive discrimination for select causes (like the one above) to help rein in the well-meaning monster it has created?
