Monday, July 07, 2014

You may have been part of a Facebook experiment.

Last week, Facebook reported that it had tinkered with about 700,000 users’ news feeds as part of a psychology experiment conducted in 2012. The study found that users who saw more positive content were more likely to write positive posts, and vice versa.
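
The mechanics were simple in outline: classify each News Feed post as positive or negative, then silently withhold a random fraction of one kind of post from each treatment group. The Python below is purely illustrative (the published study used the LIWC word-counting tool to classify posts; none of these function or field names are Facebook's).

```python
import hashlib
import random

# Illustrative sketch only; not Facebook's actual code.
# Each post is a dict such as {"text": "...", "sentiment": "pos" | "neg" | "neu"}.

def assign_group(user_id: str) -> str:
    """Deterministically bucket a user into one arm of the study."""
    h = int(hashlib.sha256(user_id.encode()).hexdigest(), 16)
    return ["control", "fewer_positive", "fewer_negative"][h % 3]

def filter_feed(posts, group, omit_rate=0.1, seed=0):
    """Withhold a random fraction of positive or negative posts from treatment users."""
    rng = random.Random(seed)
    kept = []
    for post in posts:
        if group == "fewer_positive" and post["sentiment"] == "pos" and rng.random() < omit_rate:
            continue
        if group == "fewer_negative" and post["sentiment"] == "neg" and rng.random() < omit_rate:
            continue
        kept.append(post)
    return kept
```

The researchers then compared the emotional tone of what each treatment group went on to post against the control group.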

The experiment was the work of Facebook's Data Science team, a group of about three dozen researchers with unique access to one of the world's richest data troves: the movements, musings and emotions of Facebook's 1.3 billion users. The team has run hundreds of tests. One published study deconstructed how families communicate, another delved into the causes of loneliness, and a third looked at how social behaviors spread through networks. In 2010, the group measured how "political mobilization messages" sent to 61 million people influenced turnout in that year's congressional elections.

The experiments ignited outrage from people who found it creepy that Facebook would play with unsuspecting users' emotions. Because the study was conducted in partnership with academic researchers, it also appeared to violate long-held rules protecting people from becoming test subjects without giving informed consent.

However, networks like Facebook are thriving petri dishes of social contact, and many social science researchers believe that by analyzing our behavior online, they may be able to figure out why and how ideas spread through groups, how we form our political views and what persuades us to act on them. It may be only by understanding the power of social media that we can begin to defend against its worst potential abuses.

Over the last few years, Facebook has expanded what it calls its Data Science team to conduct a larger number of public studies. The company says the team’s mission is to alter our understanding of human psychology and communication by studying the world’s largest meeting place. So far, it has produced several worthy insights.

In 2012, the data team published a study that analyzed more than 250 million users; the results shot down the theory of “the filter bubble,” the long-held fear that online networks show us news that reinforces our beliefs, locking us into our own echo chambers. Like the new study on people’s emotions, that experiment also removed certain posts from people’s feeds.

Much of Facebook's research is less controversial than the emotions study, testing features that prompt users to spend more time on the network and click on more ads. Other Internet companies, including Yahoo Inc., Microsoft Corp., Twitter Inc. and Google Inc., conduct research on their users and their data. Google has acknowledged running about 20,000 experiments on its search results every year. It once tested 41 different shades of blue on its site, each color served to a different group, just to see which hue garnered the most engagement from users. In the same way, YouTube offers different suggested videos to different people to get them to watch more videos, and Amazon manipulates its store design, offering different deals and layouts, so that people spend more money.
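
That kind of bucketing is straightforward to sketch. The snippet below shows one common way a multi-variant test like the shades-of-blue experiment could assign users: hash the user ID together with an experiment name so each person sees the same variant on every visit. Everything here is illustrative; real systems add logging, gradual ramp-up and exclusion rules.

```python
import hashlib

# 41 hypothetical hues of blue, hex codes #0000d7 through #0000ff.
SHADES_OF_BLUE = [f"#0000{b:02x}" for b in range(215, 256)]

def variant_for(user_id: str, experiment: str, n_variants: int) -> int:
    """Hash user + experiment name so the assignment is stable across visits."""
    digest = hashlib.md5(f"{experiment}:{user_id}".encode()).hexdigest()
    return int(digest, 16) % n_variants

def link_color(user_id: str) -> str:
    return SHADES_OF_BLUE[variant_for(user_id, "link-color-test", len(SHADES_OF_BLUE))]
```

The experimenter then compares click-through rates across the buckets and ships the winning variant to everyone.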

The responses to the experiment were overwhelmingly negative, but there is little likelihood that Facebook and others will stop this kind of A/B testing. Still, there is hope that these companies will become more transparent about their experimental processes. That could include reviewing riskier or more controversial tests with an independent ethics board, or giving users some way to find out whether they have quietly been placed into an experiment. Facebook and other companies could also provide outside researchers with some open access to privacy-protected, anonymized data. Insights about humanity are locked in the data of these big tech companies, and some of them may never be studied if they lack profit potential.
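
As a rough illustration of what "privacy-protected" sharing might involve, the sketch below pseudonymizes records before release: direct identifiers are dropped and the user ID is replaced with a salted hash. This alone is not real anonymization (re-identification from the remaining fields is still possible, which is why techniques like k-anonymity and differential privacy exist), and all field names here are hypothetical.

```python
import hashlib
import secrets

# Kept secret by the data holder and never released with the data.
SALT = secrets.token_hex(16)

def pseudonymize(record: dict) -> dict:
    """Drop direct identifiers and replace the user ID with a salted hash."""
    released = {k: v for k, v in record.items() if k not in {"name", "email", "user_id"}}
    released["pid"] = hashlib.sha256((SALT + record["user_id"]).encode()).hexdigest()[:16]
    return released
```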

Main Source: http://online.wsj.com/articles/facebook-experiments-had-few-limits-1404344378
