You Are Under the Control of Facebook, and It Will Control You and Your “Friends”: Now Proven by Its Emotion Manipulation Study

By | June 30, 2014

As the mainstream media loses influence over people who are now addicted to sites like Facebook, the level of influence and depth of reach of this new form of information control needed to be quantified if the population is to be kept under control. In an experiment conducted in 2012 and published earlier this year, Facebook skewed the feeds of nearly 700,000 of its (unknowing) users toward negative content to see if it would make the posts those users wrote more negative.

The researchers believe that it did. The mood of the posts seen in the news feeds of the experiment’s subjects moved like a “contagion” into the posts of said subjects.

The inverse, too, was true, the researchers say.
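Mechanically, an experiment like this comes down to two steps: score each candidate News Feed post as emotionally positive or negative (the researchers reportedly relied on simple word-count classification), then randomly withhold posts of the targeted sentiment from the subjects’ feeds. The sketch below is purely illustrative; the word lists, the omission probability, and the function names are placeholders of mine, not anything from Facebook’s actual system.

```python
# Illustrative sketch only: word-list sentiment scoring plus random
# suppression of posts with the targeted sentiment. The word lists,
# omission probability, and function names are hypothetical placeholders,
# not Facebook's actual implementation.
import random

POSITIVE_WORDS = {"happy", "great", "love", "awesome", "fun"}
NEGATIVE_WORDS = {"sad", "terrible", "hate", "awful", "lonely"}


def sentiment(post):
    """Classify a post as positive, negative, or neutral by counting keywords."""
    words = [w.strip(".,!?").lower() for w in post.split()]
    pos = sum(w in POSITIVE_WORDS for w in words)
    neg = sum(w in NEGATIVE_WORDS for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"


def filter_feed(posts, suppress, omit_prob=0.5):
    """Drop each post of the suppressed sentiment with probability omit_prob."""
    return [
        p for p in posts
        if sentiment(p) != suppress or random.random() >= omit_prob
    ]


# Toy example: reduce the positive content a user sees.
feed = ["I love this!", "Such a sad, lonely day.", "Meeting moved to 3pm."]
print(filter_feed(feed, suppress="positive"))
```

Whether the posts a subject then wrote turned more negative is the outcome the researchers claim to have observed.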

It’s not surprising that Facebook would mess with the moods of its users, who (allow me to remind you) are its bread and butter. Exposing users to the advertisements of Facebook’s partners, on and off Facebook, is the social media giant’s only real business.

It’s not even surprising that the company would publish the results; the paper appeared in the Proceedings of the National Academy of Sciences (PNAS). Facebook users are so easily manipulated that all Facebook would have to do is tell them it was all OK and that they actually enjoyed being manipulated, like one big hypnotized stage show.

From the paper:

“When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.”
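The quoted result is, at bottom, a comparison of word percentages: how many positive and negative words subjects used in their own posts under each condition. A minimal sketch of that outcome measure, again with toy word lists and invented example posts rather than real data, could look like this:

```python
# Illustrative outcome measure: percentage of positive and negative words
# in the posts written by each experimental group. Word lists and example
# posts are invented placeholders, not data from the study.
POSITIVE_WORDS = {"happy", "great", "love", "awesome", "fun"}
NEGATIVE_WORDS = {"sad", "terrible", "hate", "awful", "lonely"}


def word_percentages(posts):
    """Return (% positive words, % negative words) across a group's posts."""
    words = [w.strip(".,!?").lower() for p in posts for w in p.split()]
    if not words:
        return 0.0, 0.0
    pos = sum(w in POSITIVE_WORDS for w in words)
    neg = sum(w in NEGATIVE_WORDS for w in words)
    return 100 * pos / len(words), 100 * neg / len(words)


# Toy comparison between a reduced-positivity group and a control group.
reduced_positivity_group = ["Feeling lonely today.", "The commute was terrible."]
control_group = ["Had a great lunch with friends.", "The commute was terrible."]

print("reduced positivity:", word_percentages(reduced_positivity_group))
print("control:           ", word_percentages(control_group))
```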

But the PNAS has doubts about the validity of the research outcome. It’s unclear, the PNAS says, whether the negative and positive posts from research subjects were caused only by the manipulation of the news feed, and not by negative interactions with other users. Yet it still published the research and its outcome.

Facebook data scientist Adam Kramer led the research. Here’s what it says at Kramer’s American Psychological Association page: “D.I. Kramer, PhD, has an enviable subject pool: the world’s roughly 500 million [now well more than a billion] Facebook users.”

The other researchers were Cornell University professor Jeff Hancock and UCSF post-doctoral fellow Jamie Guillory.

The paper says that users provided tacit consent to be used in research studies when they signed up for Facebook and agreed to Facebook’s Data Use Policy.

So no legal exposure for Facebook, but definitely some more bad vibes from a company that has demonstrated over and over that the needs of its advertising business always trump the needs of its users.

This experiment takes Facebook’s disregard to another level, as it actively sought to impact the well-being of users. What if someone was pushed over the edge into suicide because their manipulated feed made it seem the world was falling apart and they no longer wanted to belong?

Facebook already has plenty of ways to make people unhappy, from its humblebrags to its envy-inducing profiles.

Last year a University of Michigan study told us that Facebook makes many young people depressed.

Another study, published by Berlin’s Humboldt University, reported that Facebook often fills users with feelings of envy.

VentureBeat asks: “And what is the point of this research? Why is it being conducted? Is it purely an academic exercise, or could it be used by some unscrupulous party to mess with people’s feeds and moods on a regular basis?”

Is this really a question?

Facebook should immediately disclose to each user whose feed it manipulated what they were subjected to and when.

Facebook Doesn’t Understand The Fuss About Its Emotion Manipulation Study.

This weekend, the Internet discovered a study published earlier this month in an academic journal that recounted how a Facebook data scientist, along with two university researchers, turned 689,003 users’ News Feeds positive or negative to see if it would elate or depress them.

Not unlike the Bilderberg Group’s members turning the mainstream media positive or negative, pro-gun or anti-gun, pushing their “the terrorists are after you” civil-rights-stealing lies. Wake the fuck up, you sheeple.

The purpose was to find out if emotions are “contagious” on social networks. (They are, apparently.)

“To find out”? How about to prove, once again, how fast and to what degree you sheeple can be manipulated.

The justification for subjecting unsuspecting users to the psychological mind game was that everyone who signs up for Facebook agrees to the site’s “Data Use Policy,” which has a little line about how your information could be used for “research.” Some people are pretty blasé about the study, their reaction along the lines of, “Dude. Facebook and advertisers manipulate us all the time. NBD.” Others, especially in the academic environment, are horrified that Facebook thinks that the little clause in the 9,045-word ToS counts as “informed consent” from a user to take part in a psychological experiment, and that an ethics board reportedly gave that interpretation a thumbs up. The larger debate is about what companies can do to their users without asking them first or telling them about it after.

I asked Facebook yesterday what the review process was for conducting the study in January 2012, and its response reads a bit tone deaf. The focus is on whether the data use was appropriate rather than on the ethics of emotionally manipulating users to have a crappy day for science. That may be because Facebook was responding to a privacy reporter:

“This research was conducted for a single week in 2012 and none of the data used was associated with a specific person’s Facebook account,” says a Facebook spokesperson. “We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it’s positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people’s data in connection with these research initiatives and all data is stored securely.”

Remember reading this part of Facebook’s data use policy?

It’s particularly fascinating to me that Facebook puts this in the “research to improve our services” category, as opposed to the “research for academic purposes” category. One usable takeaway in the study was that taking all emotional content out of a person’s feed caused a “withdrawal effect.” Thus Facebook now knows it should subject you to emotional steroids to keep you coming back. It makes me wonder what other kinds of psychological manipulation users are subjected to that they never learn about because they aren’t published in an academic journal.

This gives more fodder to academic Ryan Calo, who has argued that companies need to get their psychological studies of users vetted in some way that echoes what happens in the academic context. When universities conduct studies on people, they have to run them by an ethics board first to get approval; those ethics boards were mandated by the government in the 1970s because scientists were getting too creepy in their experiments, getting subjects to think they were shocking someone to death in order to study obedience, for example.

Interestingly, the Facebook “emotional contagion” project had funding from the government, the Army Research Office, according to a Cornell profile of one of the academic researchers involved. And the professor who edited the article said the study was okayed by an Institutional Review Board. That approval has led most academic commentators’ jaws to hit the floor.

Before this story broke, Betsy Haibel wrote a relevant post that linguistically elevated the stakes by calling companies’ assumption of consent from users “corporate rape culture.” “The tech industry does not believe that the enthusiastic consent of its users is necessary,” wrote Haibel. “The tech industry doesn’t even believe in requiring affirmative consent.”

When I signed up for 23andMe — a genetic testing service — it asked if I was willing to be part of “23andWe,” which would allow my genetic material to be part of research studies. I had to affirmatively check a box to say I was okay with that. As I suggested when I wrote about this yesterday, I think Facebook should have something similar. While many users may already expect and be willing to have their behavior studied — and while that may be warranted with “research” being one of the 9,045 words in the data use policy — they don’t expect that Facebook will actively manipulate their environment in order to see how they react. That’s a new level of experimentation, turning Facebook from a fishbowl into a petri dish, and it’s why people are flipping out about this.