Facebook Is Way Creepier Than We Thought

by Emma Cueto

If you're anything like me, you've already checked Facebook several times today – how else are you going to know what your friends from elementary school are up to or what your cousin's kids look like? But even though we all know that Facebook tries to keep us on the site as long as possible, it turns out they've also been manipulating our moods without even telling us. And if that sounds creepy, it's because it is.

Facebook just published a study in the prestigious Proceedings of the National Academy of Sciences revealing that for one week in 2012, they deliberately monkeyed with the newsfeeds of almost 700,000 users. Some people's newsfeeds were altered so that they were more likely to see posts containing positive, happy words, while others' feeds showed more posts with negative or sad words. At the end of the week, the Facebook team found, the people who saw happy posts were more likely to post happy things themselves, while the people who saw sad posts were more likely to post sad ones.

From this, Facebook concludes in their study that emotional states can be transferred via social media. The rest of the world, on the other hand, concludes that Facebook is a massively creeptastic Internet empire dabbling in thought control – and that the experiment probably violated ethical standards for research on human subjects.

(Also, it's worth pointing out that even though Facebook is all proud of themselves for managing to manipulate people's emotions, the study could just as easily be interpreted to mean that people mimic their friends' posting behavior, not that their actual emotional states were altered. But I can't even with that right now, because creepy.)

Facebook claims that this experiment was totally fine because all Facebook users have agreed to the site's user agreement, which states that users’ data may be used “for internal operations, including troubleshooting, data analysis, testing, research and service improvement.” So legally, they're probably in the clear, but ethically, that's another story.

James Grimmelmann, a professor of technology and the law at the University of Maryland, told Slate: “If you are exposing people to something that causes changes in psychological status, that’s experimentation ... This is the kind of thing that would require informed consent.” And agreeing, years ago, to a sprawling user agreement you almost certainly never read doesn't meet the standards of "informed consent" for a psychological experiment.

Then, of course, there are the ethical implications of trying to make someone sad on a whim. I'm sure that for most people the week didn't have any lasting impact, but given how much we already know about how Facebook affects our emotional states, it's entirely possible that for people with a history of depression, that week affected them in ways that didn't end when the experiment did.

And all of this is overlaid with the pure creepiness of Facebook treating its users like lab rats – feeling totally comfortable with, and entitled to, using us all as human test subjects, not in pursuit of some greater good, but to perfect Facebook's ability to manipulate us even further. That is, quite simply, not okay. It's not okay at all.

And the fact that we don't know whether they've done this sort of thing before, or whether they're doing it right now, is even worse. Because if Facebook is willing to treat its users like lab rats once – if they're so comfortable with the idea that they'll publish a study admitting to it themselves – then they're certainly capable of doing it more than once. And with over one billion people using Facebook, the implications of that are highly unsettling.