You’ve probably already heard the story, and are currently very pissed off. If not, here is a short summary:
Facebook manipulated the algorithm that controls what almost 700,000 of its users saw in their News Feeds, to determine whether an increase in negative tone leads users to write more negative posts, or vice versa. It turns out users were affected, but only in a minor way.
But is this anger really justified? What we do know for sure is that the experiment was completely legal. I think the question of whether it was ethical stems from the fact that laws have not kept up with the way the internet works. People also put social media in a very different league from search, because the content is so much more personal. Thus Facebook doing what every other media site out there does - tweaking the feed to drive a higher rate of engagement - feels more morally egregious than if, say, Amazon did it. Below are a few reasons why I think this is not as black and white as it seems:
1) Advertisers do this all the time: they try to manipulate user sentiment to get people to click on ads, engage with content, and buy specific products. The whole concept of a "brand" is emotional manipulation.
2) The effect was really, really tiny. According to a Pew Research article, "As reported by the authors, the number of negative words used in status updates increased, on average, by 0.04% when their friends’ positive posts in news feeds were reduced. That means only about four more negative words for every 10,000 written by these study participants. At the same time, the number of positive words decreased by only 0.1%, or about one less word for every 1,000 words written. (As a point of reference, this post is a little more than 1,000 words long.)"
Of course, with such a huge user base, even tiny differences can be statistically significant - but statistical significance says nothing about practical importance.
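To see why, here is a minimal sketch of a two-proportion z-test. This is not the paper's actual analysis, and the word counts and the 1.75% baseline negative-word rate below are made-up assumptions for illustration; the only "real" number is the 0.04 percentage-point bump from the Pew quote above.

```python
# A minimal sketch (not the study's actual analysis) showing why a tiny
# effect can still be "statistically significant" with enough data.
import math

def two_proportion_ztest(x1, n1, x2, n2):
    """Two-sided z-test for the difference between two proportions."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)                 # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))     # two-sided normal p-value
    return z, p_value

# Assume (hypothetically) ~10 million words written per group and a 1.75%
# baseline negative-word rate. The treatment group writes 4 extra negative
# words per 10,000 -- the 0.04 percentage-point bump Pew describes.
n = 10_000_000
baseline, bump = 0.0175, 0.0004
z, p = two_proportion_ztest(int(n * (baseline + bump)), n,
                            int(n * baseline), n)
print(f"z = {z:.1f}, p = {p:.2e}")   # a vanishingly small p-value
```

Under these assumed numbers the p-value is vanishingly small, so the result is "significant" - even though the real-world effect is four words in ten thousand.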
3) The experiment stems from the original hypothesis that happy content is what makes people sad and drives them off FB. Single people get depressed after seeing all the happy couples on FB, and people struggling to make it professionally are in constant envy of those who seem to "have it all." News Feed drives user engagement on the site, so the sentiment of posts is tied directly to Facebook's overall business model. Facebook HAS to come up with a good way of picking the content you see from a huge pool of information, and it wants to show you the content that will keep you most engaged with the site. This means they have to, have been, and will be running tests like these.
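To make "picking the best content from a large pool" concrete, here is a toy sketch of a feed ranker. The signals, weights, and field names are entirely invented for illustration and have nothing to do with Facebook's actual ranking system; the point is only that sentiment is just one more knob an A/B test can turn.

```python
# Toy feed ranker (purely illustrative -- not Facebook's algorithm).
# Every post gets an engagement score; the feed shows the top-k posts.
from dataclasses import dataclass

@dataclass
class Post:
    author_affinity: float   # how close you are to the poster (0..1)
    predicted_clicks: float  # a model's guess at engagement (0..1)
    sentiment: float         # -1 (negative) .. +1 (positive)

def score(post: Post, sentiment_weight: float = 0.2) -> float:
    # Hypothetical linear blend of ranking signals.
    return (0.5 * post.author_affinity
            + 0.3 * post.predicted_clicks
            + sentiment_weight * post.sentiment)

def build_feed(pool: list[Post], k: int, sentiment_weight: float = 0.2) -> list[Post]:
    return sorted(pool, key=lambda p: score(p, sentiment_weight), reverse=True)[:k]

# The experiment is, in effect: build one group's feed with a lower (or
# higher) sentiment_weight, then compare what that group goes on to write.
```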