Click even more sentences to ponder

January 2, 2018 at 10:31 am in Political Science, Web/Tech

Facebook announced that it will no longer use “Disputed Flags” — red flags next to fake news articles — to identify fake news for users. Instead it will use related articles to give people more context about a story.

Why it’s happening: The tech giant is doing this in response to academic research it conducted that shows the flags don’t work, and they often have the reverse effect of making people want to click even more.

That is from Noah Berger at Axios.

1 derek January 2, 2018 at 10:36 am

It took ‘academic’ research to figure that out?

We are led by stupid people.

2 JWatts January 2, 2018 at 10:37 am

There is value in confirming the obvious.

3 dan1111 January 2, 2018 at 10:43 am

Well, duh.

4 msgkings January 2, 2018 at 11:51 am

Heh.

5 derek January 2, 2018 at 10:46 am

The obvious being ‘we are led by stupid people’?

There is an old trick, as old as printed news: front-page quotes from the ministers and priests, all agreeing that something is a threat to children, women, and morality, leading to a full house at whatever evil event was being hosted. Journalists, being largely disreputable scoundrels, loved this, as it poked the right people in the eye. You could say that this is what has characterized the media environment for the last year and a half.

These overeducated fools, who know the alphabet of sexual identities, haven’t the faintest clue about either the history of their occupation or human nature.

6 Pshrnk January 2, 2018 at 12:21 pm

People interpret a red flag as “this is important” rather than “this is sketchy.” That is a problem with how they flagged the fake news.

7 JK Brown January 2, 2018 at 12:44 pm

Next thing you know, they’ll confirm that “if it bleeds, it leads.”

8 Hadur January 2, 2018 at 10:40 am

This is why I get most of my news from news aggregation websites that show me 2-3 stories about every topic. Hell, on Google News, it shows you 2-3 stories and then you can click and see 1,000+ stories.

9 clockwork_prior January 2, 2018 at 10:51 am

And your filtering of the barely rewritten newswire stories making up most of that 1000 is undoubtedly top notch.

10 Willitts January 2, 2018 at 10:43 am

And their “related links” are just hand-picked leftist counterpropaganda.

It won’t be long before there is a First Amendment challenge under a community access standard. FB should focus on its business of providing a forum and stay out of content.

11 ʕ•ᴥ•ʔ January 2, 2018 at 10:46 am

Spoken like a guy who knows he needs fake news to make his case.

It would be so much stronger to say “great, and I have some true things to share!”

12 TMC January 2, 2018 at 11:24 am

Spoken like a guy who knows that if Facebook says it’s false, it’s true but inconvenient for the Democrats.

13 ʕ•ᴥ•ʔ January 2, 2018 at 12:04 pm

Sorry, buddy. More basic insecurity about what is true.

Try being more forthright. Tell us an underappreciated truth *and* link to a reputable source supporting it.

Like I do, when I say I believe in AGW and then link to NASA and NOAA.

14 Jan January 2, 2018 at 12:07 pm

There is a lot of marginally dishonest news around, which wasn’t affected by this. What Facebook was flagging was obviously false stuff, like “Obama returns to native Kenya to lead Jihadist coup.” It’s like National Enquirer crap in the grocery store checkout line that, unfortunately, a certain percentage of true dumbasses actually believed.

15 Willitts January 2, 2018 at 1:58 pm

Does National Enquirer crap have to be rebutted and counterpointed? Anyone with half a brain knows it’s false, and anyone who doesn’t know it’s false won’t be persuaded otherwise. And I don’t find the counterpropaganda to be any more true or informative.

The bottom line is that the internet has created unprecedented exposure to information, including misinformation. The antidote is honesty and openness, not suppression and substitution.

16 Potato January 2, 2018 at 5:22 pm

Was it the NE that brought down John Edwards?

Was it the Drudge Report on Lewinsky?

Actually asking, I have a faint memory of these two scandals being driven by bullshit sources.

If Facebook wants to put their finger on the scale, I say go ahead. As long as using Facebook is not mandatory I do not see the harm.

If Google starts interfering with information search-ability then it’s time for Big Brother to set some rules. We can call it net neutrality…

17 ʕ•ᴥ•ʔ January 2, 2018 at 5:33 pm

Google is trying to dig themselves out of a fake news problem.

http://www.foxnews.com/tech/2017/10/04/digital-disaster-facebook-google-carry-fake-news-on-las-vegas-shooting.html

The problem was that they were too sensitive to the “clickiest” news stories immediately after a disaster. This was an automated response to usage patterns, and not an editorial decision. Now they’ll have to figure out some discernment. Will someone claim that the “clickiest” is the unfiltered news?

18 clockwork_prior January 2, 2018 at 10:49 am

‘It won’t be long before there is a first amendment challenge under a community access standard.’

Actually, that will be a really long time, as Facebook has nothing to do with ‘community access standards.’ After all, it isn’t as if Facebook was punished for performing scientific experimentation on its members, without their awareness, much less consent – ‘Facebook let researchers adjust its users’ news feeds to manipulate their emotions – and has suggested that such experimentation is routine, which is seemingly how the idea got past the advertising firm’s ethics committees.

In 2012, researchers led by the company’s data scientist Adam Kramer, manipulated which posts from their friends the sample of nearly 700,000 users could see in their “News feed”, suppressing either positive or negative posts, to see whether either cheerful or downer posts seemed to be emotionally contagious.

With only a user’s “agree click” on Facebook’s terms and conditions document to provide a fig-leaf of consent to the creepy experiment, researchers from Cornell University and the University of California, San Francisco manipulated users’ news feeds.

Let’s hear from the university:

“The researchers reduced the amount of either positive or negative stories that appeared in the news feed of 689,003 randomly selected Facebook users, and found that the so-called ’emotional contagion’ effect worked both ways.”

Cornell Social Media Lab professor Jeff Hancock reports the simple correlation that turned up: “People who had positive content experimentally reduced on their Facebook news feed, for one week, used more negative words in their status updates. When news feed negativity was reduced, the opposite pattern occurred: Significantly more positive words were used in peoples’ status updates.”’ https://www.theregister.co.uk/2014/06/29/researchers_mess_with_facebook_users_emotions/

19 Willitts January 2, 2018 at 2:02 pm

Facebook has everything to do with it.

There is legal precedent for protecting First Amendment rights from private actors when those actors have created a community forum. I can’t put my finger on the citation at the moment.

The point is that if FB gives people an open soapbox, it begins to lose control over who and what is presented. It has become public and not exclusively private speech. Someone will soon file suit over FB censorship and win.

20 ʕ•ᴥ•ʔ January 2, 2018 at 10:43 am

The most powerful and positive news here is that social media engineers have taken up the task of providing good news, and are iterating based on quantitative results.

Yes, the red flag bit is tragically ironic. As a footnote.

21 Sigivald January 2, 2018 at 3:10 pm

How do we trust them to not conflate “good” as in “solid” with “good” as in “politically convenient or agreeable to the corporate culture or some other standard”?

22 ʕ•ᴥ•ʔ January 2, 2018 at 4:05 pm

Open source methods?

As people have suggested, having secret algorithms acting on an unknown subset of accounts makes a very opaque picture.

23 Potato January 2, 2018 at 5:28 pm

We disagree on many things, but I had assumed you were a left leaning bear.

How the times have changed. Liberals actively pressuring companies to control information flow. Liberals actively pressuring companies to make distasteful opinions/information invisible.

I support FB’s right to control the content that it shows. It’s a publicly traded firm in the private sector. CNN, for similar reasons, should be allowed to choose what content it displays.

24 ʕ•ᴥ•ʔ January 2, 2018 at 5:37 pm

My old line is that if you see me as left, you must be right. This was backed up by some self-testing on the political compass.

In fairness though, it might be tough to distinguish an animated centrist fighting right wing excesses, from someone on the left.

You might have to wait for a left government doing left excesses to see the whole me.

25 ʕ•ᴥ•ʔ January 2, 2018 at 5:46 pm

Current score: Economic Left/Right: -0.63; Social Libertarian/Authoritarian: -2.15

26 clockwork_prior January 2, 2018 at 10:44 am

Wait, Putin must be involved, right? ‘Red flags’ bring up the bad old better dead than red days, when the KGB knew how to handle a good disinformation campaign.

27 Butler T. Reynolds January 2, 2018 at 10:50 am

Google News has tried to overcome some of the fake news problem by not allowing you to edit or hide their “Top Stories” section. I loathe Team Red and Team Blue equally, so their bias regarding what’s important in that section doesn’t bother me in a team sport kind of way.

What I do not appreciate is Google deciding what information is good for me. It took a while, but I trained myself to not even notice that section when I’m on the page. My eyes automatically go to my customized news feeds.

28 anon January 2, 2018 at 11:00 am

Noah Berger is just the AP photographer who took the picture. The report is by Sara Fischer.

29 clockwork_prior January 2, 2018 at 11:24 am

Does this mean someone needs to post a red flag?

30 Tom January 2, 2018 at 11:45 am

They have been hoist by their own petard.

31 rayward January 2, 2018 at 11:50 am

Social media is the End of Intelligence. Yes, that’s a play on artificial intelligence, which some worry will replace humans in work. That may or may not be true, but what is true is that AI has already replaced the intelligence of those obsessed with social media.

32 Hazel Meade January 2, 2018 at 11:53 am

Related articles … as in Snopes.

33 JWatts January 2, 2018 at 2:04 pm

Ah, Snopes, one of the best sites on the web until it fell prone to a leftward bias sometime in the past decade. It used to be my go-to site for any obvious debunking. Then at some point, instead of showing the relevant facts about a topic, it started interjecting obvious opinion into pieces and leaving out facts that didn’t fit the narrative.

34 JWatts January 2, 2018 at 2:09 pm

I’m not positive it was just a leftward bias; it could have been a general decline in quality coupled with a lot of left-leaning staff members. The decline might have been a direct result of Barbara Mikkelson (one of the two married founders) dropping out as a contributor sometime in the early 2010s.

35 The Anti-Gnostic January 2, 2018 at 11:58 am

Facebook’s executives seem really intent on exorcising their former clean, cool, decentralized aesthetic but I guess that’s where the money is. I can’t keep track of all the clutter, and I’m uncomfortable with the software recommending I “friend” 1) people with whom I have a strictly professional relationship, and 2) complete strangers, so I’ve stopped using it. I always thought the most marketable thing about Facebook was its seeming exclusivity. MySpace didn’t seem to think that was very important to people so it emptied out pretty quick.

This arc seems hard to avoid. I remember when Restoration Hardware first hit the market and it was like a group of wealthy steampunk fans got together and said to hell with it, we want a giant propeller in the living room. Now it’s Rooms To Go for rich people.

Here’s how old I am: I remember when Banana Republic sold Israeli and Brazilian military surplus.

36 Hazel Meade January 2, 2018 at 12:32 pm

On a related note, I’m starting to think I’m going to ban our kids from having any social media account until they are at least 16. Social media is a vice that is mentally unhealthy enough for adults. Using Twitter or Facebook is the psychological equivalent of chain smoking. People need to reach a certain level of emotional maturity to safely use social media, and pre-adolescent children don’t have it.
It’ll probably be hard to stop, but so is getting them to refrain from sex, drugs, and alcohol.

37 msgkings January 2, 2018 at 12:44 pm

Good call, same here. Not hard for me, I’m not even on Facebook and never have been. Don’t miss it one bit.

38 Hazel Meade January 2, 2018 at 12:55 pm

I’m including even “kids only” social media in this, though. It’s not just Facebook, it’s all social media as it’s currently constructed. I’m willing to postulate that some kind of not-unhealthy social media could be developed, but we’re just not there yet. We need another generation or so to develop the norms and for the appropriate kinds of online spaces to evolve.

39 Willitts January 2, 2018 at 2:10 pm

“Social” anything tends to degrade to the lowest common denominator.

Sartre knew what he was talking about in Huis Clos.

40 The Anti-Gnostic January 2, 2018 at 1:01 pm

I am pretty confident in saying that for most people under age 25–maybe even under age 35–Facebook is just plain not cool. That’s probably why its executives are so eager to get cellphones and internet for Africa instead of things like wellhead covers and waste disposal.

41 Hazel Meade January 2, 2018 at 1:07 pm

Instagram and Snapchat are just as bad if not worse.

42 Willitts January 2, 2018 at 2:07 pm

My girls are too busy to have time for social media. School, music lessons, family activities, homework, chores, reading, church, friends.

They know I’ll swat their asses if I catch them using Snapchat filters. 🙂

43 Matthew January 2, 2018 at 12:03 pm

I still don’t understand why anyone thinks this is any more of a problem (if it even is one at all) than it has been since the days of The National Enquirer. People are entertained by fake news. Sensationalized distortions of reality are very entertaining.

44 Willitts January 2, 2018 at 2:12 pm

Exactly. I can enjoy a laugh or a rise from fake news while knowing it’s 100% false. The Onion is beginning to look less like satire.

45 ʕ•ᴥ•ʔ January 2, 2018 at 12:07 pm

By the way, the most recent Ritholtz podcast (with Anil Dash) was on the same issue, and the difficulty of finding a solution.

46 JK Brown January 2, 2018 at 12:54 pm

“Instead it will use related articles to give people more context about a story.”

If only the “education” cartel taught kids how to study.

“The student has accomplished much when he has discovered some of the closer relations that a topic bears to life; when he has supplemented the thought of the author; when he has determined the relative importance of different parts and given them a corresponding organization; when he has passed judgement on their soundness and general worth; and when, finally, he has gone through whatever drill is necessary to fix the ideas firmly in his memory. Is he then through with a topic, or is more work to be done?”

[Besides incorporating the ideas into the student’s general knowledge for use in general thinking, the student is advised to develop a tentative rather than fixed attitude toward the knowledge, i.e., to expect it to be challenged and modified, and to determine his own (individual) opinion of the ideas before considering the opinions of others.]

p. 232:
“The young student should come to regard acquaintance with the varying views as necessary to the formation of a reliable opinion on any topic and of sound judgement in general.”

How to Study and Teaching How to Study (1909) by F. M. McMurry, Professor of Elementary Education, Teachers College, Columbia University

47 TallDave January 3, 2018 at 2:21 pm

Why should I believe that?
