Should we care that Facebook is manipulating us?

by Tyler Cowen on June 29, 2014 at 12:08 pm in Music, Religion, Science, The Arts, Uncategorized | Permalink

Facebook manipulated the emotions of hundreds of thousands of its users and found that they would pass on happy or sad emotions, the company has said. The experiment, for which researchers did not gain specific consent, has provoked criticism from users with privacy and ethical concerns.

For one week in 2012, Facebook skewed nearly 700,000 users’ news feeds to be either happier or sadder than normal. It found that after the experiment was over, users tended to post positive or negative comments according to the skew given to their news feeds.

The research has provoked distress because of the manipulation involved.

Clearly plenty of ads try to manipulate us with positive emotions, and without telling us. There are also plenty of sad songs, or for that matter sad movies and sad advertisements, again running an agenda for their own manipulative purposes. Is the problem with Facebook its market power? Or is it the sheer and unavoidable transparency of the notion that Facebook is inducing us to pass along similar emotions to our network of contacts, thus making us manipulators too, and in a way which is hard for us to avoid thinking about? What would Robin Hanson say?

Note by the way that “The effect the study documents is very small, as little as one-tenth of a percent of an observed change.”  How much that eventually dwindles, explodes, or dampens out in the longer run I would say is still not known to us.  My intuition however is that we see a lot of longer-run dampening and also intertemporal substitution of emotions, meaning this is pretty close to a non-event.
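
For intuition on the dwindle-versus-explode question, here is a toy geometric-carryover sketch (my own illustration with invented numbers, not the study’s model):

```python
# Toy sketch of the dampening intuition (my illustration; the numbers are
# invented and the model is not from the study). Each "round," a fraction
# of the previous mood shift is passed along to the next set of friends.

def cumulative_effect(initial_shift, carryover, rounds):
    """Total mood shift accumulated over `rounds` of friend-to-friend
    transmission, where each round carries over `carryover` (0..1) of
    the previous round's shift."""
    total, shift = 0.0, initial_shift
    for _ in range(rounds):
        total += shift
        shift *= carryover  # dampening whenever carryover < 1
    return total

# The reported effect was on the order of 0.1% of an observed change.
initial = 0.001
for carryover in (0.1, 0.5, 0.9):
    print(carryover, round(cumulative_effect(initial, carryover, 20), 5))
# Any carryover < 1 converges to roughly initial / (1 - carryover): still
# a tiny number, consistent with the "pretty close to a non-event" reading.
# Only a carryover above 1 would make the effect explode.
```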

The initial link is here.  The underlying study is here.  Other readings on the topic are here.

I hope you’re not too sad about this post [smiley face]!

Marshall Kirkpatrick June 29, 2014 at 12:20 pm

Tyler, I am disappointed to read this. Informed consent in human experiments has been an important principle since the Tuskegee experiment was exposed. It’s about dehumanization and abuse of power. According to Cornell’s write-up, the study was funded in part by the US Army. There are salient historical precedents here beyond Buzzfeed, A/B testing, and heartstring-tugging Hallmark ads. The fact that the impact was quantitatively small isn’t really important when it comes to the principle of the thing. Imagine a less alienated scenario, where the Facebook employees behind this looked at the single moms, the terminally ill people, the struggling immigrants who were all undoubtedly among the random set of subjects, and said “let’s turn the dial down on that person’s perception of happiness among their friends.” I think that’s pretty sociopathic and a violation of a lot of hard-fought ethical standards.

Claudia June 29, 2014 at 12:43 pm

I agree and I think the National Academy of Sciences should retract this study from their journal. Facebook may not have broken any laws but without informed consent this should not be published as a scientific study … otherwise it encourages other veiled experiments for ‘science.’

hamilton June 29, 2014 at 2:27 pm

No, they shouldn’t. Field experiments (experiments in which the subjects do not and will not know that they are participating) happen all the time. One of the questions for whether informed consent is necessary is whether the behavior is already publicly observable. Another is whether the experiment exposes subjects to risks greater than they would normally face in everyday life. In this case, what you post on Facebook is quite public, and you regularly encounter posts with net positive or net negative affect on social network sites. Informed consent, and indeed institutional review of human subjects research, has gone way too far into the social sciences, and should be rolled back significantly.

Claudia June 29, 2014 at 2:48 pm

I disagree, but I can see there are gray areas here. It would not have been tough for Facebook to send out a message to those selected for the study: “We would like to include you in an experiment over the next few months. You are unlikely to notice much difference in your usual experience here. Click ‘like’ if you agree to participate.” Maybe they could just measure the affect of users after receiving that message versus those who did not?

Claudia June 29, 2014 at 3:01 pm

Oh and to be a little more supportive of the science … Facebook could have carried out all kinds of natural field experiments with their normal data. There are enough random news events of positive/negative emotion on local, national, and global scales that they would easily have had enough random variation to test their hypothesis. In fact, I thought it was pretty well established that our moods are affected by the emotional content we are exposed to.

prior_approval June 29, 2014 at 1:06 pm

‘Is the problem with Facebook its market power?’

It is called the Declaration of Helsinki, and is intended to prevent exactly what Facebook did, by codifying a set of human research ethics, including the essential role of informed consent. http://en.wikipedia.org/wiki/Declaration_of_Helsinki

But why would anyone honestly expect a man who has called the Nazis a bump in the road to accepting the bright promise of eugenics to actually care about human research ethics?

‘I hope you’re not too sad about this post’

No, it provides the typical level of expected amusement at the best satire site on the web.

gwern June 29, 2014 at 1:16 pm

> Informed consent in human experiments has been an important principle since the Tuskegee experiment was exposed. It’s about dehumanization and abuse of power.

So why invent this concept of ‘informed consent’ at all? (which, like the number of angels dancing on the head of a pin, is so ill-defined and vague that it has employed generations of bioethicists – quick, when you sign up for a medical trial, exactly how much of the biochemical pathways do you need to know for your consent to be ‘informed’?)

Why not simply focus on the harms done to participants as balanced against the benefits of the experiment? This immediately condemns the Tuskegee & Nazi experiments: the results were largely worthless, and where they weren’t worthless (eg freezing airmen) their value was minimal compared to the harm done to obtain them, hence, they were evil.

Marie June 30, 2014 at 8:49 am

Because my judgment of the balance of costs and benefits of an act someone conducts towards me is going to be different from my judgment of the balance of costs and benefits of the same act when I conduct it towards someone else.

Nicholas Marsh June 29, 2014 at 2:51 pm

I don’t think so. In general it is completely acceptable to use, without informed consent, anonymized data drawn from existing sources (i.e., none of the data directly analyzed by the researcher can be linked to an individual, and the data was not created for the experiment). If you think about it for a second, a very large proportion of the social sciences relies upon such anonymized data (economists don’t need to get consent when using anonymized data on things like personal wealth, and large chunks of economics would cease if they had to).

Users of Facebook had already passed on their data to Facebook, and this link makes it clear that the data analysed was not linked to Facebook profiles (i.e., it was anonymized). http://www.theatlantic.com/technology/archive/2014/06/everything-we-know-about-facebooks-secret-mood-manipulation-experiment/373648/

In addition, the above link states that the study had been approved by the researchers’ local institutional review board. As I would expect, given that it used anonymized data which Facebook users had already given Facebook permission to use.

Claudia June 29, 2014 at 3:42 pm

This is not administrative data … Facebook ran an experiment.

Nicholas Marsh June 29, 2014 at 5:06 pm

Yes, as have very many social scientists before them. When studying people’s behaviour there are huge problems with experimenter effects. At the very least people become self conscious when they know they are being observed. So some social science research involves not telling the subjects so their ‘natural’ behaviour can be observed. Some of the classic experiments in psychology have used ‘undisclosed observation’ in which the subjects were not aware that they were part of an experiment. Others have involved researchers observing behavior in natural environments (ie not in lab conditions) where it would defeat the whole object of the experiment to inform the participants. Obviously there are ethical issues and a clear need for safeguards (concerning privacy among many others) which is why such experiments would need to be considered by an ethical review board before they could take place, which is what the Facebook study did.

Claudia June 29, 2014 at 5:31 pm

This is pretty lame IRB reasoning: http://m.theatlantic.com/technology/archive/2014/06/everything-we-know-about-facebooks-secret-mood-manipulation-experiment/373648/ I understand that FB can do this per their terms of use. I find it troubling to have it presented as a scientific study.

derek June 29, 2014 at 8:09 pm

Why is that troubling? They are marketing their wares, and science as a selling point is quite effective, especially when what they are trying to sell is a large base of users that can be manipulated.

Steve Sailer June 29, 2014 at 6:53 pm

Marketing researchers have been attempting to manipulate people in studies for a hundred years. E.g., you put billboards up for Brand X on the East Side of town but not on the West Side, then try to measure whether sales go up more on the East Side.
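
In code, the logic of such a geo-split test is just a treatment-versus-control comparison; a minimal sketch with invented numbers:

```python
# Sketch of the billboard geo-split logic (invented numbers, for
# illustration only): Brand X billboards go up on the East Side; the
# West Side is the untreated control. The estimated effect is the
# difference in sales changes (a simple difference-in-differences).

sales = {
    # region: (units sold before campaign, units sold after campaign)
    "east": (1000, 1150),  # saw the billboards
    "west": (1000, 1040),  # did not
}

east_lift = sales["east"][1] - sales["east"][0]  # 150
west_lift = sales["west"][1] - sales["west"][0]  # 40
print("estimated billboard effect:", east_lift - west_lift)  # 110 units
```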

Steve Sailer June 29, 2014 at 6:57 pm

I’ve noticed that in all the Malcolm Gladwellish hoopla over “priming,” and in the concern over Big Data as if it were anything new, people tend to ignore the existence of the trillion-dollar marketing industry and the multi-billion-dollar marketing research industry.

Marketers are always trying to manipulate you and they hire marketing researchers to conduct studies of whether their manipulations work.

Dan Weber June 30, 2014 at 8:54 am

Informed consent is important, but it’s not an absolute requirement. There are lots of important studies that simply cannot happen with informed consent. For example, studying how people react to a fire drill is something that cannot be done with informed consent, and it generates knowledge that is critical for actually getting people out of a building during a real fire. You tell people they are there to study something else, and then a fire drill goes off.

You still need an outside party providing guidance on this, so an IRB would need to sign off.

I don’t know if this study passes those thresholds or if it had an IRB.

Nathan W July 1, 2014 at 8:09 pm

It’s not the results per se but the method that is at fault. These methods are not allowed. Period.

The direction of the results also confirms a hypothesis which essentially demonstrates that the reasons to require ethical review of such methods are in fact relevant.

Please say three times, “I am not a lab rat, I am not a lab rat, I am not a lab rat”, then “and neither are you”.

This is the golden rule in social sciences research. You can observe all you want, but you have to start telling people when you are playing God by first manipulating then observing the result.

I am not a lab rat. And if Facebook wants to suggest that it’s OK to treat me like one, then I certainly hope I am not alone in saying “Where’s the next Facebook? One we can maaaaaybe trust a little.”

Paid subjects cost money. If you want lab rats, you must tell them what’s happening and then pay them for their time.

This means that many interesting things will never be studied directly. We will know less about ourselves as a result.

But the ultimate outcome? I am not a lab rat.

A B June 29, 2014 at 12:20 pm

People in the entertainment industry have been inadvertently or deliberately manipulating us for decades. Anyone with a teenage kid has an opportunity to see this firsthand.

Facebook is no different in that regard, except that it reaches a billion people and is being run by a thirty-something with a rather narrow (though glorious) life experience and not a great history of self-restraint.

Rahul June 29, 2014 at 12:26 pm

It’s all about stated / implied goals: with ads or songs or movies people expect their emotions to be manipulated. With something like Facebook they don’t. Hence the outrage. Legitimate I think.

e.g. Imagine AT&T started parsing text messages & intentionally dropping 50% of those that had the word “Obama” in them. Or AP or Reuters issued a memo to their newsdesk to suppress positive stories involving Muslims.

When people are manipulated by services they don’t expect it from, they resent it.

Dave June 29, 2014 at 7:31 pm

AT&T is a regulated common carrier, and so is prohibited by law from what you are suggesting. Facebook, AP, and Reuters are all content providers and editors/aggregators, and moreover get quite literally all of their money directly or indirectly from advertising. Of course they are in the business of filtering content. If the content wasn’t filtered, it would be unreadable. It’s kind of hard for me to believe there are substantial portions of the adult population who don’t get this.

Rahul June 30, 2014 at 12:36 am

And you don’t get that there’s a difference between one filter and another? If a firm filters resumes based on qualifications, no one cares, but try race and you’ll get a different outcome.

Nathan W July 1, 2014 at 8:13 pm

Media provider wants to filter content so I get the content that I want to read.

Good job. Thank you. It saves me time filtering through stuff I don’t want to read myself. This is how it’s supposed to work. They reach me with stuff more attuned to what I want, and as a result can target me with products that I am more likely to buy (perhaps including their own content down the road).

This is not the same situation as a media provider serving differentiated content to different users in order to see what they click on afterwards (using different messages to see who clicks on what is common practice). Here the differentiation was done to systematically parse people’s words and see how the content could emotionally affect them.

Please, I think you have a sufficiently creative mind to imagine innumerable ways in which tolerance to this sort of stuff could get way out of hand and play a role in nightmare ending scenarios.

There’s a line. Facebook was on the wrong side of it. Barely. They need to be slapped hard enough to remember.

JCKCA June 29, 2014 at 12:34 pm

I think it is generally not considered an ethical breach unless you diagnose someone without their consent or the published data can somehow be used to figure out the reaction and identity of an individual. So if the study concludes that 18 of the users were psychotic based on the analysis of the results, that is inappropriate. But if it is limited to “the comments seemed sadder,” then that is okay. I’m not sure how the Facebook experiment differs from the Obama campaign fundraising as described here: http://www.businessweek.com/articles/2012-11-29/the-science-behind-those-obama-campaign-e-mails other than that Facebook published the results.

Nathan W July 1, 2014 at 8:15 pm

Why don’t you go to the website of the university you attended and try to find what you need to do to even START to get ethical approval for such a study.

Humans are not research subjects. Yes, I know, for all manner of marketing things this is not practically all that true. But a) it can be abused, and b) it bothers me, a lot.

Jon M June 29, 2014 at 12:42 pm

What I don’t understand is why everyone is up in arms about this study but not the previous one they did encouraging people to vote: http://www.nature.com/nature/journal/v489/n7415/abs/nature11421.html

Kabal June 29, 2014 at 12:49 pm

If people’s emotions are this sensitive and/or easily manipulated, then we should thank Facebook for the confirmation instead of shooting the messenger.

derek June 29, 2014 at 12:55 pm

Good point. Nothing like the old two by four across the head to begin a conversation.

Nathan W July 1, 2014 at 8:16 pm

The results basically confirm the rationale for not permitting those studies.

As a saving grace, they could try to convince the public that maybe it’s what they were getting at in the first place.

They will go as far as we let them.

Kelly June 29, 2014 at 12:52 pm

This reminds me of the icky feeling I get when I read articles advocating ways that people can be “nudged” to improve their lives…

Nathan W July 1, 2014 at 8:19 pm

Compare A to B.

A. Default status: You’re a donor as soon as you turn 18. You can opt out any time you want.

B. Default status: You must opt in to be a donor.

Which system results in more donors? It is a rare case to praise Israel for policy entrepreneurship. And a rare positive manifestation of their fairly legitimate paranoia regarding their poor relations with neighbours.

Are there ickier examples of “nudging” than the above? Or do I understand it incorrectly?

derek June 29, 2014 at 12:54 pm

In one of my iterations I did sales of big ticket items. What surprised me was how open to manipulation people would make themselves; it was almost like they wanted to be manipulated into believing it was a good idea. I would purposely manipulate them back to reality: I needed to get paid in the end, and by that time the feeling would have worn off, so I would deliberately say something cautionary or slightly negative, watch the sudden fall back to earth, then proceed on a more rational decision path. It worked very well; I got the sales I needed and the money at the end.

I wonder if the anger is about people realizing that they were played for fools. Maybe a bit of ruefulness that this is what Hope and Change actually looks like.

derek June 29, 2014 at 12:58 pm

It is one thing to be manipulated into buying a sporty model of car instead of the minivan. Or to get fries with your burger. At least you get something, even if you laugh at yourself for falling for the old trick. After all that manipulation what did people get except humiliation at being fooled?

Dave June 29, 2014 at 1:15 pm

If you go to the front page of Amazon.com right now, the actual contents and layout of the page that you see are the result of literally dozens of simultaneous A/B experiments. The intent is to figure out just how to optimize your purchases from Amazon. Many of these experiments will take your entire history of clicking on the Amazon site and any affiliated sites into account. I don’t see how this is any different, nor do I find any of it particularly sinister.

Willitts June 29, 2014 at 10:03 pm

The difference being that if I buy something from Amazon, there is a mutually beneficial trade going on. If they target me with effective ads, then I am better off. Amazon isn’t manipulating how I feel about life, the universe, and everything.

Dave June 29, 2014 at 10:47 pm

Speaking as someone in the trade, Amazon would be more than happy to manipulate you on that level if it could result in a +0.5% CTR. I don’t know that they are, but it wouldn’t surprise me even slightly. The most amazing thing (I find) is that it’s actually possible that they are manipulating you on that level without even knowing that they are. Correlations found using machine learning algorithms don’t necessarily come with explanation, and are in any case devoid of anything like intent or volition, even if you hook them up for direct feedback to the user experience.

Also, I would note that if you are a Facebook user (as I am), you presumably think their product has some value, even if you don’t pay for it. Just how did you think they made their money, if not from you?

Willitts June 29, 2014 at 11:00 pm

FB makes money because of the sheer volume of its users. I do use it and they have never earned a cent from me, but they may have earned a cent from someone else I know.

Again, we aren’t talking about Amazon affecting my general feelings when they entice me to click through. If I click through and don’t buy, there is a winner and a loser in their contract with someone else. Enough losing and the counterparties will disappear.

Dave June 29, 2014 at 11:50 pm

You’ll note that on the right hand side of the Facebook page there are advertisements. Even if you never click on them, I guarantee Facebook has earned money from your actions (if not directly from your pocket). They can increase that money by making you stay on the site longer, more often, or generally feel better about it. Facebook (and Amazon, and Google, and, and, and) run experiments (literally hundreds per day) in order to optimize the amount of money they make from you. That’s all that is occurring in this story. I have absolutely no idea what possible distinction people are making between this case and standard marketing activities that have been occurring since just about the time we as a species learned what this whole “statistics” thing was.

Nathan W July 1, 2014 at 8:25 pm

Amazon doesn’t provide me with newsfeeds, and I don’t tell Amazon how I feel.

But if this changes, then I will be equally up in arms.

Say you have a blog. Like this one. Perhaps the bloggers want to understand their audience better. So one day they write something which they think will upset many of their readers, but maybe they’ll find out that 20% respond positively. And then they fine-tune the offering, or perhaps just become more interested in some issues. This seems reasonable to me. The goal is to better understand the audience in order to attract readers, develop ideas, etc.

But Facebook wasn’t trying to figure out what content people wanted. They wanted to know if they could manipulate my mood, and to find out, they manipulated people’s moods without telling them, then observed their personal writings to see if they could detect a difference. Wrong side of the line.

Andrew' June 29, 2014 at 1:18 pm

Please tell me this study isn’t stupid. If I get a sad news story, I’m going to make a “sad” comment. It doesn’t mean anything other than that is the news story I just read.

derek June 29, 2014 at 1:31 pm

If I manipulate the news that people hear, I can manipulate how they feel about things. Not really rocket science.

What Facebook is selling isn’t a feeling to the people who are members. They are selling the ability to manipulate their members.

It is an interesting evolution in the tech world. Microsoft became dominant by having their platform available everywhere and everything that anyone wanted to do could be done on their platform, and many times only on their platform. The experience was largely undefined by Microsoft other than you could have one. They have lost the market leader status to someone who controls the experience. Facebook seems to be experimenting with controlling the experience. Apple sells boxes and phones and tablets and their strategy sells more. Facebook sells an audience, and a controlled audience is vastly more valuable to them.

Nathan W July 1, 2014 at 8:31 pm

Imagine if you sold books and pills for people with depression.

You approach Facebook and ask them to modify the news feeds of people who have posted the word “depression” in any post in the last 4 weeks. You want them to be just a little more depressed, so please make the news feeds just a little more depressing. Then you offer them the solution. Given the other 200 things you know about this person from Facebook information, you target them with “100 Ways to Love Yourself,” where they can buy clothing online at a website specially designed for the (presumably) depressed visitor. The business, of course, couldn’t care less about helping the person, but if they can tweak that depression just right and double annual sales, then perhaps it’s worth spending a few hundred grand on a Facebook campaign to help depressed girls hit rock bottom just in time to be saved by a clothing website which tells them how amazing they are. They could have started with the positive approach, but if these kinds of studies show they can manipulate people in just that kind of way …

Here’s a depressing thought: That’s our future if we don’t stand up against this. Wrong side of the line.

Jan June 29, 2014 at 1:29 pm

Clearly the risks outweigh the benefits.

Sergey Kurdakov June 29, 2014 at 2:01 pm

I see another aspect of this research: people are manipulated by events. If there are many sad world events, someone becomes sadder for no other reason than being interested in world news. But hardly anyone objects, if only because no one is directly responsible for the manipulations the medium itself performs. Yet the harm from a ‘freely’ acting medium could be much greater than the harm from research which revealed some facts.

That said, I think the research is very useful, because it keeps people alert to manipulation not only by people but by the medium itself.

Jar Jar Binks June 29, 2014 at 2:12 pm

Fuck the IRB

Donald A. Coffin June 29, 2014 at 2:55 pm

Note as well that FB does not automatically include in what one sees the posts of people who are one’s “friends.” They have some sort of an algorithm that I don’t quite get (why would I not want to see a post by one of my friends?). So it’s not just ads; it’s what people have posted with the intention that their friends see it. The word for this is “censorship.”

And I’m sure FB would respond to the “informed consent” issue by saying that we have all accepted their “terms of service.” Well, that’s not what my university’s IRB would have allowed in any research involving human subjects. And it’s not what most of us would probably interpret the “terms of service” to say…or even to imply.

Andreas Baumann June 29, 2014 at 6:37 pm

In your feed settings, you can switch to “most recent” instead of “top stories”. That’ll give you all content from your friends.

prognostication June 29, 2014 at 7:13 pm

I don’t think this is quite true. You definitely see more with Most Recent than Top Stories, but some people’s content still never shows up in my feed.

Donald A. Coffin June 30, 2014 at 1:52 pm

No, it won’t. I’m always set to “most recent,” and I know things don’t always appear (even some of my own postings don’t show up).

Dave June 29, 2014 at 6:45 pm

>why would I not want to see a post by one of my friends?

Presumably because you’ve got too many Facebook friends posting more stuff overall than you want to read. To fix that problem, Facebook started automated filtering and prioritization of feeds a few years back, greatly improving usability. They base it on the posts that you actually spend time reading or responding to. You can turn it off if you want, but I’m guessing that if you’re like most people you’ll find the results not to your liking. Calling it censorship is extremely silly. If nothing else, it’s content _on_ Facebook, so I’m not sure how Facebook could be said to be censoring it.
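
A minimal sketch of what such engagement-based prioritization could look like (hypothetical on my part; Facebook’s actual ranking is proprietary):

```python
# Hypothetical engagement-based feed ranking (a sketch; Facebook's real
# algorithm is proprietary). Score each candidate post by how often the
# viewer has engaged with its author, then surface only the top posts.

from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str

# Invented per-friend engagement rates: the fraction of each friend's
# past posts that this viewer read or responded to.
engagement = {"alice": 0.8, "bob": 0.05, "carol": 0.4}

def rank_feed(posts, top_n=2):
    """Return the top_n posts whose authors the viewer engages with most."""
    return sorted(posts, key=lambda p: engagement.get(p.author, 0.0),
                  reverse=True)[:top_n]

feed = [Post("bob", "lunch pic"), Post("alice", "big news!"),
        Post("carol", "article link")]
for post in rank_feed(feed):
    print(post.author, "->", post.text)
# bob's post never surfaces, which is how a friend's posts can fail
# to appear in your feed at all.
```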

As to their terms of service, what would the part about “research” possibly have meant, if not something like this? The only thing even remotely newsworthy about something like this is that they published in a scientific journal (I’m honestly surprised they bothered). Pretty much every web-based service has been doing experiments like this for years. Indeed, it’s pretty much the entirety of many of their business models.

dead serious June 30, 2014 at 9:31 am

“…greatly improving usability…”

Debatable.

Dave June 30, 2014 at 10:58 am

I have absolutely no doubt that Facebook has usability metrics for the difference, out to three decimal places and with a lot better p-values than most published science.

Donald A. Coffin June 30, 2014 at 1:53 pm

Um…I have 43 friends. That’s “too many”?

Nathan W July 1, 2014 at 8:34 pm

You would have to be able to find a way to contact them first.

I’ve been trying to close my account for years, with the sole outcome of a number of friendly jabs regarding inability to correctly process the sequences also used to access the site in the first place.

Bill June 29, 2014 at 3:20 pm

I am deeply saddened and increasingly depressed from this post.

I agree with the comments about IRB, with a caveat.

Ask how you would deal with the following issue: marketing is always manipulation. So, if a marketing faculty member advises a company on how to improve its advertising program, or recommends tests of various programs in different markets, would this rise to the level of requiring informed consent from the target audiences? I doubt it.

In advertising and marketing, everyone is a human subject experiment.

I’ll end this comment with a happy face. But I don’t know how to make the emoticon.

Marie June 30, 2014 at 10:15 am

Actually, there is advertising that informs about a product and advertising that intends to persuade.
We’re just very used to the second kind being ubiquitous.

Nathan W July 1, 2014 at 8:35 pm

You propose to take a bunch of people, give them different treatments, then see how they respond differently.

How did you make the observation? How did you apply the treatment?

Are they treated as research subjects? If so, and you cannot observe their actions without manipulating them in the first place, then you must pay them.

bxg June 29, 2014 at 5:00 pm

The paper accepts the principle of informed consent but says:

“As such, it was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research”

The authors could have honestly disagreed that informed consent was necessary for such a small manipulation (and it seems many commenters above concur), but that’s not what they did. They claimed to have it, but just flat out lied. Or didn’t lie, but claimed it only by redefining it into something unrecognizable according to even the most permissive research standards. THIS, not the research itself, is the true sin.

S June 29, 2014 at 5:22 pm

If Facebook is manipulating you, you should care. Specifically, you should be embarrassed.

Willitts June 29, 2014 at 9:59 pm

It’s pretty haughty to believe yourself beyond the reach of emotional manipulation.

Marie June 30, 2014 at 9:33 am

I signed up and then almost immediately cancelled. I get a lot of email spam from them; I see the subject lines without opening the messages, and that does manipulate me. One line every day or two. It genuinely affects me, I think largely by making me feel like I have a wide circle of people who know me and whom I know, which is a content-less (and therefore not subject to scrutiny and correction) esteem-builder. I don’t like the emails and I don’t want to sign up for Facebook, but it has not occurred to me until now to just block them with my spam filter. Why not? Is it because I kind of like that daily reminder that I’m not friendless?

I can’t imagine how manipulated I’d be if I actually used the thing.

Nathan W July 1, 2014 at 8:37 pm

I disagree. You should be embarrassed if you are surprised. Unless the surprise is that they weren’t up to ten times worse.

Tip of the iceberg? Methinks so. But no one likes a witch hunt.

ummm June 29, 2014 at 5:57 pm

How weak must your constitution be to have your emotions manipulated by a subtle Facebook update? Must have been liberals.

Andreas Moser June 30, 2014 at 7:21 am

Your comment made me sad. :-(

Affe June 29, 2014 at 6:37 pm

What if there was no initial manipulation, simply the release of a report about a manipulation that never occurred? Until now? Is THAT manipulation?

(takes another hit)

MIND. BLOWN.

Willitts June 29, 2014 at 9:57 pm

Great question.

Claudia June 29, 2014 at 6:41 pm

This is a good post on both sides of this issue: http://thomasleeper.com/2014/06/facebook-ethics/ Our experiment in Big Data continues …

mulp June 29, 2014 at 6:49 pm

Unless Facebook can manipulate me by mental telepathy, it can’t manipulate me, as I am one of the half billion with a Facebook account that is ignored, because Facebook has zero value we can find.

Even free is too expensive.

derek June 29, 2014 at 8:11 pm

Finally I have found common ground with mulp.

Klypherd klaighton June 29, 2014 at 8:59 pm

yet one more reason to not use Facebook.

The Other Jim June 29, 2014 at 9:17 pm

If it were not possible to significantly affect peoples’ moods by delivering them good or bad news, let’s say based on something like which party controls the White House, then (a) the New York Times would not be in business and (b) we would have a different President.

Steve Sailer June 29, 2014 at 9:55 pm

If it were possible to manipulate the mood of Americans via the media, then Carlos Slim would be a major owner of the New York Times.

Willitts June 29, 2014 at 9:55 pm

Lots of good comments here. I can tell they are good because I changed my opinion five times while scrolling down the thread.

My net conclusion is that Facebook can likely legally do this, but it is nonetheless disturbing. Yes, advertisers do play on our emotions, but they do so for the specific purpose of selling their products. We know at a conscious level that our emotions are being manipulated. It is above board.

Here, the intended effect is a generalized emotional state for no specific purpose related to commerce. The experiment is underground.

The implication is that this can be used for propaganda purposes. This could affect the outcome for incumbent political candidates.

To a large extent, Hollywood already does this by flooding the airwaves with their memes. Again, this is transparent.

When Huffington Post selects its stories, again, we know which side HuffPo’s bread is buttered on. Sites like Upworthy are less transparent shills. FB has taken manipulation to a level that would have impressed George Orwell. The one merit in FB’s favor is that they made the results public.

Even if we passed a law against this, the Supreme Court would likely strike it down – and they should. What might pass and could withstand judicial scrutiny is a sunshine act.

Tununak June 29, 2014 at 10:15 pm

Facebook users will be unhappy about this because they perceive Facebook as a site that lets them share their news with their friends at no cost. It’s been a benevolent, nonthreatening Big Mommy. Now Mommy has manipulated them on a whim.

There’s no free lunch, but there also is no benevolent Big Mommy, or Big Daddy Government for that matter. It just takes some people longer to learn that.

Willitts June 29, 2014 at 11:04 pm

No. Both FB and I know the value of our mutual exchange and we are both satisfied. I just don’t appreciate when Mark Zuckerberg throws a rock at my car window when he knows I’m likely sleeping.

Darren Johnson June 30, 2014 at 4:02 am

Non-event unless it wasn’t published for research.

Andreas Moser June 30, 2014 at 7:20 am

If you have been receiving my Facebook posts, you were in the sample group for smart posts.

Marie June 30, 2014 at 9:44 am

Yes, Facebook is Ebola scary and Hitler evil, no news there.

How many people cancel after stories like these? It’s not an indispensable product, and there’s no gun to heads.

I don’t think there’s any answer to this stuff. Orwell certainly was right, but I don’t think he anticipated the level at which our submission to propaganda starts off as voluntary. The bit of libertarian in me is horrified at the idea of regulating or banning Facebook, but just opting out as an individual doesn’t solve the problem; you still have to live in the world with the zombies. I’m amazed by the number of businesses and organizations that use Facebook as their information platform and set it up so you can’t access it without signing in. Why would I want to stop people from reading about my product? But it seems they only want certain kinds of people reading about their products, I guess the kind of people who don’t have issues with Facebook telling them what to do or, in this case, how to feel. I guess that can be handy, if you can get to a tipping point where even folks like me can’t get through their regular business without signing up. Crowd forcing.

I guess the free market solution is that some alternative comes in to disrupt this. Why hasn’t one? Why does a country of 300+ million so often all choose one product, or at most two?

Urso June 30, 2014 at 10:39 am

Is this appreciably different from companies who regularly A/B their websites and study how the viewers respond differently to different versions? If so, why?

See: http://www.wired.com/2012/04/ff_abtesting/

“Without being told, a fraction of users are diverted to a slightly different version of a given web page and their behavior compared against the mass of users on the standard site. If the new version proves superior—gaining more clicks, longer visits, more purchases—it will displace the original; if the new version is inferior, it’s quietly phased out without most users ever seeing it.”
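
The mechanics described there fit in a few lines; a generic sketch, not any company’s actual code:

```python
# Generic A/B test sketch (not any company's actual code). A fraction of
# users is deterministically diverted to variant B, then a behavior metric
# is compared across the two groups.

import hashlib
import statistics

def assign_variant(user_id, experiment="new_layout", b_fraction=0.10):
    """Stable assignment: hashing user + experiment name means a user
    always sees the same variant, with no per-user state stored."""
    digest = hashlib.md5(f"{experiment}:{user_id}".encode()).hexdigest()
    return "B" if int(digest, 16) % 100 < b_fraction * 100 else "A"

# Invented outcome data: clicks per session observed in each group.
clicks = {"A": [2, 3, 1, 2, 4, 2], "B": [3, 4, 2, 5, 3, 4]}
lift = statistics.mean(clicks["B"]) - statistics.mean(clicks["A"])
print("variant for user 42:", assign_variant(42))
print("observed lift (clicks/session):", round(lift, 2))
# If the lift holds up statistically, B quietly replaces A; if not,
# B is phased out and most users never saw it.
```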

Nathan W July 1, 2014 at 8:39 pm

What do you want to bet that the people who designed the study were more interested in subliminal effects (who really reads the Facebook newsfeed that closely?) than in whether they could manipulate mood in the first place?

I’m pretty creative, so I don’t want to share my thoughts, but the possibilities for abuse abound.

tk421 July 6, 2014 at 5:46 am

Facebook is for idiots… so no, why should we care? Religion, government, any so-called authority will and does manipulate for its agenda. We are all fair game. That’s the contract. Why do we write fiction? Why do we listen to music? We are all caught in the trap.
