Facebook, the world’s biggest social networking site with more than 1.5 billion monthly active users, doesn’t just need users to make money. It needs users who are active and engaged. Facebook already knows whether you are single or dating, which school you went to first and, of course, all your likes and interests. It gathers that kind of information by tracking your daily Facebook activity, by analyzing the posts and pages that you like, and by running psychological experiments.
Yes, what many of us feared is already a reality: Facebook is using us as lab rats and has been conducting social experiments on its users. And yes, chances are you’ve involuntarily taken part at some point. Experiments are happening at the company all the time, and virtually every Facebook user has been part of one.
Here are a few experiments that Facebook data scientists ran on users (possibly on you), sometimes in collaboration with academic researchers. They are known to the public because the results were published.
Study 1: Massive-Scale Emotional Contagion
When: January 2012
Number of people involved: 689,003 users
What Facebook wanted to find out: Whether the emotional tone of a user’s news feed, i.e., seeing more positive or more negative posts, affects what that user then posts.
How they did it: For one week in January 2012, Facebook data scientists manipulated the news feeds of almost 700,000 users, showing some of them more positive updates than average and others more negative ones, all to see how it affected the users’ moods.
And when the week was over, the manipulated users were more likely to post especially positive or especially negative updates themselves: those shown more negative posts posted more negative content, and vice versa.
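Facebook hasn’t published the code behind this, but the core mechanic is easy to sketch: probabilistically hide posts of one emotional tone from a feed. Here is a minimal Python sketch, assuming posts arrive pre-labeled as positive or negative; the post data and omission rate are invented for illustration.

```python
import random

# Toy reconstruction of the experiment's core mechanic (an assumption,
# not Facebook's actual code): posts arrive pre-labeled by tone, and
# the treatment randomly withholds one tone from the user's feed.
posts = [
    {"text": "Best day ever!", "tone": "positive"},
    {"text": "Everything is awful.", "tone": "negative"},
    {"text": "Got the job!!!", "tone": "positive"},
    {"text": "So tired of this weather.", "tone": "negative"},
]

def build_feed(posts, suppress_tone, omit_probability=0.5):
    """Return a feed with posts of `suppress_tone` randomly omitted."""
    feed = []
    for post in posts:
        if post["tone"] == suppress_tone and random.random() < omit_probability:
            continue  # hidden from this user's feed
        feed.append(post)
    return feed

random.seed(42)
print([p["text"] for p in build_feed(posts, "negative")])  # a "happier" feed
print([p["text"] for p in build_feed(posts, "positive")])  # a "sadder" feed
```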
What Facebook found out: People’s emotions can indeed be affected by what they’re exposed to on Facebook.
Did Facebook violate your privacy? Even if this kind of manipulation can’t be classified as a privacy violation, it certainly seems unethical. The public described the study as “disturbing”; after all, it involved hundreds of thousands of users unknowingly taking part in research that may have made them either happier or more depressed than usual.
Study 2: Exploring Requests for Help on Facebook
When: Summer 2012
Number of people involved: 20,000 users
What Facebook wanted to find out: Who asks for something on Facebook?
How they did it: For two weeks in July and August 2012, Facebook researchers singled out status updates containing requests, like “What movie should I watch tonight?”, “Is it okay to eat canned food that expired in 2007?” or “I need a ride to the airport.” They were interested in who regularly asks for help, not in whether the help actually arrived.
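The paper doesn’t include its classifier, but spotting request-like updates can be roughly approximated with a few keyword and punctuation heuristics. A crude Python sketch, assuming simple pattern matching; the patterns and example updates are illustrative, not the researchers’ actual method.

```python
import re

# Crude heuristics (assumed, not the researchers' real classifier) for
# flagging status updates that ask friends for something.
REQUEST_PATTERNS = [
    r"\b(should|can|could|would)\s+(i|anyone|someone|somebody)\b",
    r"\b(i need|does anyone|any recommendations?|looking for)\b",
    r"\?$",  # update ends with a question mark
]

def looks_like_request(update):
    text = update.strip().lower()
    return any(re.search(pattern, text) for pattern in REQUEST_PATTERNS)

for update in [
    "What movie should I watch tonight?",
    "I need a ride to the airport.",
    "Had a great weekend at the lake.",
]:
    print(looks_like_request(update), "-", update)
```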
What Facebook found out: Users who visit Facebook less often, but who have a lot of friends on the network, are the most likely to ask for help.
Did Facebook violate your privacy? No. The updates the researchers analyzed were public ones, so it’s no surprise that someone collected and studied them. There’s really no invasion of privacy here.
Study 3: Self-Censorship on Facebook
When: July 2012
Number of people involved: 3.9 million users
What Facebook wanted to find out: How many people hold back from blasting the network with their thoughts on something?
How they did it: For 17 days in July 2012, Facebook tracked every entry of more than five characters typed into a comment or compose box that didn’t get posted within 10 minutes.
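Mechanically, that rule comes down to two thresholds: the draft must be longer than five characters, and no post may follow within ten minutes. A toy Python sketch of that logic, with invented timestamps; this is my reconstruction of the rule as described, not Facebook’s implementation.

```python
from datetime import datetime, timedelta

# Assumed reconstruction of the detection rule described above: a draft of
# more than five characters counts as self-censored if it wasn't posted
# within ten minutes.
WINDOW = timedelta(minutes=10)
MIN_CHARS = 5

def self_censored(draft, typed_at, posted_at=None):
    if len(draft) <= MIN_CHARS:
        return False  # too short to be tracked
    if posted_at is None:
        return True   # never posted at all
    return posted_at - typed_at > WINDOW

t0 = datetime(2012, 7, 6, 21, 0)
print(self_censored("I can't believe my boss said that...", t0))    # True
print(self_censored("Great show!", t0, t0 + timedelta(minutes=3)))  # False
print(self_censored("ok", t0))                                      # False
```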
What Facebook found out: 71% of the users “self-censored,” drafting comments that they never posted. Many others edited their posts before sending them out to the social network.
Did Facebook violate your privacy? Probably. The fact that Facebook has a record of not just what you post, but also what you don’t post, is at the very least disturbing.
Study 4: The Role of Social Networks in Information Diffusion
When: August through October 2010
Number of people involved: 253 million users (half of all Facebook users at the time)
What Facebook wanted to find out: How does information spread on Facebook?
How they did it: For seven weeks between August and October 2010, Facebook researchers randomly assigned 75 million URLs a “share” or “no-share” status. The links ranged from news articles and job offers to apartment listings and upcoming concerts: any kind of link that Facebook users share. Links with the “no-share” status would not appear in your friends’ news feeds. The researchers then compared the virality of links that were allowed to be seen with those that weren’t, to find out whether the censored information would still find a way to spread.
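Stripped to its essentials, the design is a randomized holdout: each link is flipped into a “share” or “no-share” arm, and spread is compared across the arms. A minimal Python sketch of the assignment step, with hypothetical URLs standing in for the real ones.

```python
import random

# Minimal sketch of the randomized design (my reconstruction): each URL
# is assigned an arm, and "no-share" links are withheld from friends'
# news feeds so their independent spread can be measured.
random.seed(7)

urls = [f"https://example.com/link/{i}" for i in range(6)]  # hypothetical
assignment = {url: random.choice(["share", "no-share"]) for url in urls}

def visible_to_friends(url):
    """Would this link show up in friends' news feeds during the study?"""
    return assignment[url] == "share"

for url in urls:
    print(f"{assignment[url]:>8} -> {url}")
```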
What Facebook found out: Unsurprisingly, users are more likely to spread the information that they see their friends sharing. Also, according to the study, your distant friends are more likely to expose you to new information than your close friends, as judged by your likelihood of sharing it after seeing it.
Did Facebook violate your privacy? Obviously. Just imagine how much information was deliberately censored by Facebook during this study; hopefully, it was nothing important. And the fact that the company very closely tracked what you posted and how it affected your friends seems ethically dubious as well.
Study 5: Selection Effects in Online Sharing
When: Two months in 2012
Number of people involved: Over 1 million users
What Facebook wanted to find out: Does broadcasting your intention to buy something have an effect on your friends’ buying interests?
How they did it: Users who claimed “Facebook Offers” were put into two groups. One group had the offers they claimed auto-shared, so that friends would see them in their News Feeds. Users in the other group were graciously given a button to click to choose whether they wanted to broadcast the offer claim to their friends.
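In experimental terms this is a simple two-arm test: auto-share versus an explicit opt-in button, comparing how many claims end up broadcast. A Python sketch with invented user counts; the 23% opt-in rate is the figure reported below, everything else is made up.

```python
import random

# Two-arm sketch (assumed reconstruction): arm "auto" broadcasts every
# claimed offer; arm "opt-in" broadcasts only when the user clicks the
# share button. The 0.23 opt-in rate is the study's reported figure.
random.seed(1)
OPT_IN_RATE = 0.23

def broadcasts(n_users, arm):
    if arm == "auto":
        return n_users  # every claim is shared automatically
    return sum(random.random() < OPT_IN_RATE for _ in range(n_users))

n = 10_000  # hypothetical users per arm
print("auto-share broadcasts:", broadcasts(n, "auto"))
print("opt-in broadcasts:   ", broadcasts(n, "opt-in"))
```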
What Facebook found out: Friends are more likely to claim an offer when you actively decide to share it with them. But in the sheer numbers game, more offers get claimed when everyone in your friends list gets to see them.
Did Facebook violate your privacy? Yes. Auto-sharing is invasive and frankly creepy. The study’s results show that only 23% of the users who were given the option to share chose to do so. There is a clear business case here, too: figuring out how to get offers claimed is key to Facebook’s revenue.
Study 6: The Spread of Emotion via Facebook
When: Sometime before 2012, when the results were published
Number of people involved: 151 million users
What Facebook wanted to find out: Does your emotional state affect your friends?
How they did it: This was the precursor to the “emotional contagion” study. Researchers looked at 1 million users’ status updates, rating them as positive or negative based on the terms used, and then looked at the positivity or negativity of the posts of those users’ 150 million friends.
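Rating updates as positive or negative “based on terms used” is classic dictionary-based sentiment scoring: count how many words from each list appear. A minimal Python sketch; the tiny word lists are stand-ins, since studies like this rely on large validated lexicons.

```python
# Dictionary-based sentiment sketch. The word lists are illustrative
# stand-ins for the much larger lexicons such studies actually use.
POSITIVE = {"happy", "great", "love", "awesome", "excited"}
NEGATIVE = {"sad", "awful", "hate", "tired", "angry"}

def rate_update(text):
    words = set(text.lower().split())
    pos, neg = len(words & POSITIVE), len(words & NEGATIVE)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

print(rate_update("so happy and excited today"))  # positive
print(rate_update("i hate this awful commute"))   # negative
print(rate_update("off to the store"))            # neutral
```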
What Facebook found out: During the three days of the study, the researchers found that friends of users with positive updates suppressed their own negative posts, and vice versa. If you post something positive on Facebook, one out of every 100 friends who wouldn’t otherwise have done so will, according to the study, post something positive within three days.
Did Facebook violate your privacy? Could go either way. Assessing the emotional tone of status updates on Facebook is pretty mundane. However, it did lead these researchers down the path of trying to see if it was possible for Facebook to actively manipulate the emotions of individual users based on which of their friends’ posts they expose them to.
Study 7: Social Influence in Social Advertising
When: 2011
Number of people involved: 29 million users
What Facebook wanted to find out: Do ads work better on you when your friends’ names appear next to them, endorsing them?
How they did it: They showed the users two different types of ads — with and without endorsements like “John Cena liked this” — and then measured how many clicks those got.
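Measuring the effect is a straightforward click-through-rate comparison between the two variants. A tiny Python sketch with fabricated impression counts, just to show the calculation.

```python
# CTR comparison for the two ad variants; the counts are fabricated.
impressions = {
    "plain":    {"shown": 5000, "clicked": 40},
    "endorsed": {"shown": 5000, "clicked": 95},  # "Friend X liked this"
}

for variant, stats in impressions.items():
    ctr = stats["clicked"] / stats["shown"]
    print(f"{variant:>8}: CTR = {ctr:.2%}")
```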
What Facebook found out: The stronger your bond with the person endorsing the ad, the more likely you are to click the link.
Did Facebook violate your privacy? No. This is the kind of study you’d expect Facebook to conduct to improve its marketing: a clear business case for making ads work better.
Study 8: Social Influence and Political Mobilization
When: U.S. midterm elections of 2010
Number of people involved: 61 million users over the age of 18
What Facebook wanted to find out: Can Facebook encourage people to vote?
How they did it: In 2010, just before the midterm elections, Facebook researchers planted an “I Voted” button at the top of users’ news feeds, along with information about their polling place. Some users could also see the names of friends who had clicked the button. A control group got no prompt to vote at all. The researchers then checked public voting records to see which of the users actually voted.
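The design described above has three conditions: a social message (the button plus friends who clicked it), the button alone, and a control with no prompt, matched afterward against public records. A toy Python simulation of that comparison; the turnout rates are invented, with the social arm nudged up by roughly the 0.39 percentage points reported below.

```python
import random

# Toy simulation of the three conditions (assumed reconstruction).
# Base turnout rates are invented; only the ~0.39-point lift for the
# "social" arm echoes the study's reported effect.
random.seed(2010)
TURNOUT = {"control": 0.370, "button": 0.370, "social": 0.374}

def simulate(condition, n):
    return sum(random.random() < TURNOUT[condition] for _ in range(n))

n = 100_000  # hypothetical users per condition
for condition in ("control", "button", "social"):
    print(f"{condition:>7}: {simulate(condition, n):,} of {n:,} voted")
```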
What Facebook found out: Peer pressure works. Users were more likely to click the “I Voted” button if they saw their friends’ names next to it, and people who got the message in their News Feed were 0.39% more likely to have actually voted. That seems like a tiny percentage, but with 61 million people in the experiment it translates, by the researchers’ estimate, into about 340,000 votes that wouldn’t otherwise have been cast. It appears that Facebook really can encourage people to vote.
Did Facebook violate your privacy? Maybe not, but it seems highly unethical all the same. Getting people to perform their civic duty and vote is an admirable thing, yet none of the users realized that they were part of the experiment or that Facebook would look up their names in voting records, although the researchers did devise a privacy-preserving way to do the matching. There wasn’t an obvious business case for this one; it was a pure can-we-really-do-this study.
Conclusion:
Facebook was able to carry out these experiments because every user agreed to the company’s terms and conditions before creating an account, which, in Facebook’s view, constitutes informed consent for this research. Under the company’s current terms of service, Facebook users consent to the use of their data for “data analysis, testing, [and] research.”
These experiments by Facebook also demonstrate the power we’ve all given to the company purely by using the service.
How would you choose to protect your online presence? Share your thoughts with us in the comments below!