News that Facebook manipulated the content seen by nearly 700,000 users continues to make waves, and now a Democratic senator has asked the Federal Trade Commission to investigate whether the company and researchers properly notified and/or debriefed users that they were being used as experimental subjects.
The 2012 experiment tested whether users posted more positive or negative content after Facebook removed posts expressing positive or negative emotions from their news feeds. The bottom line, according to the Proceedings of the National Academy of Sciences: Yep, fiddling with emotion in news feeds can cause a little emotional change in the reader.
It’s no Milgram or Stanford Prison experiment, to be sure, but it raises questions about the ethics of experimentation, the watchful eye of Institutional Review Boards that consider research ethics, and the relationship between academic institutions and the corporate world. Cornell’s IRB didn’t weigh in because Facebook did the research, not the Cornell researchers, who only had access to the results. Since its publication, the National Academy of Sciences has expressed concern about the experiment, and Facebook has apologized for another in a series of lapses.
This post isn’t about research ethics, however. It’s about journalism ethics.
Facebook doesn’t think of itself as journalism or its employees as journalists. But when your site’s main feature is called a “News Feed” and it describes posts as “stories,” then you’ve moved into the world of journalism.
Essentially, Facebook’s genius is that it is among the world’s largest news organizations—but without reporters. As the star of the Web 2.0 world, it makes its billions by being the host of news produced by users for users.
The millions of people who use the site expect to see precisely what their “friends” have posted, in chronological order – or as “top stories,” another term dripping with journalistic implications as human-created algorithms decide what stories are most important for users to see.
So when Facebook starts editing what its readers see, it is in the journalism business.
Facebook has long manipulated content, for better and worse. Its software removes illegal and offensive content for legal and ethical reasons. But it also messes with content for financial reasons. People who use Facebook pages to communicate with groups are realizing that Facebook doesn’t give them access to everybody in the group. Facebook’s “Boost Post” plan requires page owners to pay to reach everyone who has liked the page—more money for Facebook, but with the downside that a page owner cannot freely reach everyone who has freely chosen to affiliate with the page. Many marketers are unhappy with Boost Post in principle, as well as with its effectiveness. (Would anyone like to argue that if I choose to “like” a page, then I should expect to receive the content from that page? How do I know what I’m missing otherwise?)
This post isn’t about advertising ethics, however. (When something online is free, you’re not the customer; you’re the product.) It’s about journalism ethics.
The ethical malpractice in this experiment lies in Facebook calling its posts “stories” and “news” – and then manipulating that “news.” Simply put, people who come to Facebook expect to see everything in their friends’ feeds. When Facebook secretly takes away what its readers expect, it is practicing unethical gatekeeping.
Imagine The New York Times conducting the same experiment, fiddling with whether readers’ news feeds are filled with stories about more death in Gaza or happy insurance customers. Facebook did that with its content.
If you’re going to use words such as “news” and “stories,” and if you’re going to decide what stories people see, then you’re in the journalism business. And that brings the ethical responsibility of transparency.