

Facebook's Actions Violate Security Rights of Users

Facebook Oblivious to its Ethical and Corporate Responsibilities

Recently it was revealed that when you agreed to Facebook's terms and conditions, you were also agreeing to become a subject in a psychology experiment. In 2012, Facebook permitted an academic research team to conduct an experiment on nearly 700,000 of its users. The researchers adjusted the contents of those users' timelines so that either positive or negative news dominated, and they found that positive news spread positive responses while negative news spread negative responses.

We already know about Facebook’s use of personal data, and most people are well aware of how the company hopes to monetize use of the system. The disclosure about the psychological experiment shows that Facebook recognizes no boundaries when it comes to the ethics of manipulating users. In fact, the company seems surprised anyone would even think ethics were relevant here. The controversy over the project highlights the delicate line in the social media industry between the privacy of users and the ambitions—both business and intellectual—of the corporations that control their data.

Facebook’s response, given through a media spokesperson, was to rationalize the use of the data by noting that the study was conducted during a single week in 2012 and that none of the data used was associated with a specific person's Facebook account. “We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it's positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people's data in connection with these research initiatives, and all data is stored securely.” The spokesperson was clearly worried that Facebook was being accused of violating privacy or security in some way, and showed no awareness that this might be an ethical issue.

Why is it an ethical issue? First, using data collected from Facebook users without their explicit consent not only violates their privacy rights but is also an example of corporate irresponsibility. Corporations need to be aware of their ethical obligations, which include respecting the rights of users and exercising due care in the collection of data. None of this is possible without first asking for consent to use the data, especially for a vague psychological experiment.

Sometimes it seems as though Facebook believes it is above the law; I label its actions corporate hubris. I also believe Facebook has morphed into an entity that places user rights second to the desires of sponsors and groups that want the data for their own purposes. There is no doubt in my mind that the company would benefit from ethics education on the use of data, including its responsibilities to its users and to the public at large. Such a course should be broader still, addressing what it means to be a social media giant in today's interconnected world and how to be responsible and accountable for one's actions.

Companies like Facebook, Google, and Twitter rely almost solely on data-driven advertising dollars. As a result, these companies collect and store massive amounts of personal information. Not all of that information can be used for advertising—at least not yet. In the case of Facebook, there is an abundance of information practically overflowing from its servers. What Facebook does with all its extra personal information—data that isn't currently allocated to the advertising product—is largely unknown to the public.

Facebook's Data Science team occasionally uses the information to highlight current events. Recently, it used the data to determine how many people were visiting Brazil for the World Cup. In February, The Wall Street Journal published a story on the best places to be single in the U.S., based on data gathered by the same team. Those studies have raised few eyebrows. The attempt to manipulate users' emotions, however, struck a nerve.

"It's completely unacceptable for the terms of service to force everybody on Facebook to participate in experiments," said Kate Crawford, visiting professor at MIT's Center for Civic Media and principal researcher at Microsoft Research.

Ms. Crawford said the incident points to a broader problem in the data science industry. Ethics are not "a major part of the education of data scientists and it clearly needs to be," she said.

We already know that Facebook possesses vast powers to closely monitor, test, and even shape our behavior, often while we are in the dark about its capabilities. What this particular incident tells us is that Facebook recognizes no boundaries when it comes to the ethics of manipulating users.

Blog posted by Steven Mintz, aka Ethics Sage, on July 8, 2014. Dr. Mintz is a professor in the Orfalea College of Business at Cal Poly, San Luis Obispo. He also blogs at:
