Will computer facial recognition take away our ability to lie?

June 10, 2014 at 10:25 am in Education, Law, Science

Upon testing, the system developed by Bartlett managed—in real time—to identify 20 of the 46 facial movements described in the FACS, according to a March report in Current Biology. More impressive still, the system not only identifies these movements but also distinguishes authentic expressions from false ones with an accuracy rate of 85 percent, at least in laboratory settings where the visual conditions are held constant. Humans weren’t nearly as skilled, logging an accuracy rate of about 55 percent.

Yes, Bartlett incorporated a lie detector into the facial recognition technology. This technology promises to catch in the act anyone who tries to fake a given emotion or feeling. Facial recognition is evolving into emotional recognition, but computers—not just people—are the ones deciding what’s real. (If we add voice detection to face recognition, we end up with a complete lie detection package.)

…So we can begin to imagine a near future in which we’re equipped with glasses that not only recognize faces and voices, but also truths and lies—a scenario that would provoke a revolution in human interaction. It would also constitute a serious limitation on the individual’s autonomy.

Speculative, but we can expect these techniques to improve.
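To make the description above a bit more concrete, here is a minimal sketch of the classification stage only, not Bartlett’s actual system: an off-the-shelf classifier trained on FACS action-unit intensity scores to separate genuine from posed expressions. The upstream action-unit detector is assumed to exist, and the features here are synthetic stand-ins.

```python
# Minimal sketch of the genuine-vs-posed classification stage -- NOT Bartlett's
# actual pipeline. Assumes an upstream detector has already produced per-clip
# FACS action-unit (AU) intensity scores; the features below are synthetic.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

N_CLIPS = 400  # video clips of facial expressions (synthetic)
N_AUS = 20     # action units scored per clip (the report mentions 20 of 46)

# Genuine expressions (label 1) and posed ones (label 0), drawn from slightly
# different distributions so the classifier has something to learn.
genuine = rng.normal(loc=0.6, scale=0.2, size=(N_CLIPS // 2, N_AUS))
posed = rng.normal(loc=0.4, scale=0.2, size=(N_CLIPS // 2, N_AUS))
X = np.vstack([genuine, posed])
y = np.concatenate([np.ones(N_CLIPS // 2), np.zeros(N_CLIPS // 2)])

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# A simple RBF-kernel SVM over AU features; the real system's features and
# classifier are richer (temporal dynamics matter), but the shape is the same.
clf = SVC(kernel="rbf", C=1.0).fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```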

Dan Weber June 10, 2014 at 10:35 am

If technology brings a dystopian future, it’s because of things like this.

Thelonious_Nick June 10, 2014 at 11:22 am

I’m pretty sure the Butlerian jihad is about 20 years away.

JWatts June 10, 2014 at 11:40 am

“If technology brings a dystopian future, it’s because of things like this.”

I’m not convinced that effective “Lie Detection” will lead to a dystopian future. I think it’s just as likely to lead to a future that’s distinctly better.

Thelonious_Nick June 10, 2014 at 11:53 am

Every once in a while you run into a person who’s incapable of lying, or philosophically committed to not doing so. These people are generally perceived as aggressively anti-social and rarely have friends or active social lives. I don’t see how inflicting this on everybody will make things better.

JWatts June 10, 2014 at 12:24 pm

“I don’t see how inflicting this on everybody will make things better.”

How exactly does that happen? I suspect that in the future your spouse, employer, family and friends will not slap a lie detector on you for every conversation. And I doubt we’ll see a future where everyone is walking around with some version of Google glasses and a lie detector built in.

Instead, like pretty much every new technology ever invented, it will be used where deemed important and not used where it’s deemed trivial or harmful. I suspect that there will be some heated debate on where you draw the line, but I just don’t think the result will be “dystopian”.

Dan Weber June 10, 2014 at 2:51 pm

I was thinking North Korea more than the US. Without the ability to pretend to support the people in power, minorities can be crushed.

If any tech causes a dystopia, it will be stuff that enables dissidents to be weeded out, not stuff like the NSA sucking up everyone’s information.

JWatts June 10, 2014 at 7:37 pm

“I was thinking North Korea more than the US.”

I think North Korea has already beaten us to the Dystopian future. Though I’d agree that accurate Lie Detection would probably make a lot of autocratic socialist societies worse, China included.

Marian Kechlibar June 11, 2014 at 2:20 am

Give law enforcement enough power, and North Korea emerges. Perhaps with elections, but meaningless elections of figureheads, while the real power is held by the uniforms.

There are enough laws on the books to arrest and convict everyone for multiple offences if needed. With ubiquitous physical and digital surveillance, it is easy to terrify the population into absolute submission.

prior_approval June 10, 2014 at 12:04 pm

‘I think it’s just as likely to lead to a future that’s distinctly better.’

So, you honestly think that a current or future NSA head will face one of these truth devices?

Snowden has already proved just how thoroughly lying is part of the job, after all.

And he is the one facing prison because of his oath breaking, not those who lied to Congress.

Marian Kechlibar June 11, 2014 at 2:23 am

It will almost certainly be illegal to use such devices against LEOs in the course of their duties. Probably under some stalking statute. For the powerful, surveillance is a one-way street.

That said, at least politicians will not be able to dodge this. Public appearances and televised speeches / discussions are an integral part of the job. Maybe they will resort to wearing masks, Darth Vader-style.

Marian Kechlibar June 11, 2014 at 2:29 am

It depends on who is allowed to use it on whom.

Just like with any other technology.

Jeff June 10, 2014 at 12:25 pm

On the plus side, presidents will be making a lot fewer public appearances.

Ryan June 10, 2014 at 10:44 am

Integrate with Google Glass, and all the PUAs and politicians go running. Or their skills sharpen. Darwin vs. technology. The race is on.

Mark Thorson June 10, 2014 at 10:44 am

The traditional polygraph is a fraud. Its purpose is to intimidate people who buy into the myth of its magical powers. An improved polygraph is like improved homeopathic medicine or improved astrology.

ShardPhoenix June 10, 2014 at 8:20 pm

If it actually works this time, how is it a fraud?

Mark Thorson June 11, 2014 at 12:14 am

If an improved astrology actually works, then I suppose it isn’t a fraud.

Anonymous June 10, 2014 at 10:46 am

After this technology becomes prevalent, people who are so good at lying that they can fool the algorithm become gods. It will be like in the movie “The Invention of Lying”.

JWatts June 10, 2014 at 11:41 am

“people who are so good at lying that they can fool the algorithm become gods”

How is that different than our current state of affairs? Do you think any of our current politicians have gone a year without a knowing lie? A month even?

Silas Barta June 10, 2014 at 4:25 pm

I wouldn’t say they’ll become gods, but there will definitely be an arms race as people learn ways they can break the assumptions that the algorithms work off of.

eddie June 10, 2014 at 11:07 am

If this technology becomes pervasive, teenagers will learn to beat it trivially. In the process, the social and behavioral cues which the algorithms were programmed to identify will be unlearned by the socially-aware and technologically-aware (i.e. teenagers) and new cues will evolve and become widely adopted.

A ubiquitous device which gives you instant feedback on “this looks like truth” and “this looks like a lie” is not a lie detector; rather, it’s a perfect training tool for teaching everyone how to lie.

Keith June 10, 2014 at 11:15 am

Humans are going to lose that race.

Andrew' June 10, 2014 at 1:13 pm

Which race?

Keith June 10, 2014 at 5:27 pm

The arms race between humans who will try to lie well enough to beat the machines, and the machines (plus their programmers) who will get better at detecting lies.

B.B. June 10, 2014 at 11:16 am

Those who make the most trouble in our society are often sociopaths. Alas, sociopaths can beat lie-detectors because they have no guilty physical reactions when they tell lies.

I am wondering if sociopaths have facial expressions which communicate when they lie. They may be in the 15% for which the new technology does not work.

Lie detection works worse for those who may be the most criminal. Sad.

Another issue is self-delusion. Someone may truly and sincerely believe something which is demonstrably false, and those people will not be found to be lying even though they are telling falsehoods. That could be a problem with witness testimony. The machine says the witness is telling the truth, but the witness is only reporting delusions.

Mark Thorson June 10, 2014 at 11:55 am

That’s just another piece of the myth of the lie detector. It lends support to the notion that the polygraph is an actual lie detector when you say it works most of the time, but that there’s this rare class of people on whom it doesn’t work because they have an abnormal psychology which makes them believe their own lies.

The polygraph is an intimidation device to assist the police in getting confessions out of suspects, nothing more.

Tyler Fan June 10, 2014 at 11:22 am

It’s already amazing what you can tell of someone’s mind with an fMRI. Given some Moore’s Law-type drop in the cost and size of fMRI, augmented with some of this other voice and facial technology, the day of the portable, nearly infallible lie detector may arrive sooner than we think. At that point, we will tell our lies through blog posts and via media where physical manifestations of untruthfulness remain undetectable.

mpowell June 10, 2014 at 11:47 am

I am much more likely to believe that you can use MRI readings to tell when someone is lying than that you have a foolproof facial recognition program. It’s just too easy to mask facial responses most of the time.

sailordave June 10, 2014 at 6:48 pm

the size of an fMRI is limited by the need to stick your head inside it

Nathan W June 10, 2014 at 7:48 pm

Better watch out we don’t end up with a Minority Report-like outcome!

He thought it. Down he goes … never mind whether the world may have whispered the thought into his ears continuously for years on end.

Dismalist June 10, 2014 at 12:41 pm

The start of just another arms race.

Brian Donohue June 10, 2014 at 1:06 pm

The best liars lie to themselves. I’m guessing those are the 15% who slip through.

DW June 10, 2014 at 1:12 pm

How about false positives?

Bill June 10, 2014 at 1:31 pm

If facial recognition takes away your ability to lie….

I suggest….

You do a blog.

Urso June 11, 2014 at 2:22 pm

My thought exactly. Heck, Prof. Cowen has made a living(?) out of his Sphinxian schtick, which the blog medium makes possible. Is he being dead serious? Dismissive? Is he smirking? Who knows?

Rob June 10, 2014 at 2:16 pm

If we were in a world where lying was very hard, would we choose to make it easier?

There are some situations where we don’t mind dishonesty. In those situations it won’t be socially acceptable to use these devices.

Vivian Darkbloom June 10, 2014 at 2:48 pm

Another use for the anti-surveillance mask. He clearly wasn’t lying when he wrote “look for this to be an issue”.

http://marginalrevolution.com/marginalrevolution/2014/05/anti-surveillance-mask-look-for-this-to-be-an-issue.html

Jacob A. Geller June 10, 2014 at 2:51 pm

“So we can begin to imagine a near future in which we’re equipped with glasses that not only recognize faces and voices, but also truths and lies—a scenario that would provoke a revolution in human interaction.”

Nah, not gonna happen. Some technologies that are efficient in an Econ 101 sense are nonetheless rejected (at least temporarily) due to social norms — Bluetooth headsets; Google Glass; office buildings that stay open 24/7; driverless cars — and I think this is one of those technologies.

Consider norms around privacy and personal space — sure, many people are willing to volunteer personal information on the web, but who wants to spend time with somebody wearing lie-detection glasses? Or glasses that let that person surf the web or watch a movie while you’re at work or hanging out? Or glasses that can see through clothing?

Joseph Ward June 10, 2014 at 3:12 pm

or just change how we lie

aretae June 10, 2014 at 4:07 pm

Of course, we’ve all read “The Truth Machine”?

Chauncy June 10, 2014 at 5:18 pm

There is no technology or skill that can detect someone who believes their own lies.

Nathan W June 10, 2014 at 7:45 pm

Indeed, it would be nice to be able to separate the ignorant from those who intentionally propagate BS because it aligns with outcomes that fit their ideological perspective.

Marian Kechlibar June 11, 2014 at 2:35 am

Nathan, people lie in all possible situations, not just for political purposes. Many of the lies are distinctly white, or at least not thoroughly black. Even saying nothing is sort of a lie, when you actually disagree with something being said aloud.

Devices like this are perfect for the already powerful to cement their power over the rest. I cannot see how this is a good thing. Positively Orwellian.

Nathan W June 10, 2014 at 7:36 pm

People learn to lie with their eyes, but not so easily with their whole faces.

Personally, I don’t even take words that seriously if they are not matched by action, or delivered in a way that makes it ridiculously apparent that the speaker believes what they are saying (e.g., not offering facetious arguments).

ShardPhoenix June 10, 2014 at 8:26 pm

I think people underestimate how much this technology (reliable lie detection, not necessarily this specific implementation – I think brain scanning will be more accurate) will change society. Even if it’s not used socially, having court decisions become far more accurate would be a huge win.

Of course there are less good scenarios where it’s only used on the non-powerful – ideally it would be used on politicians first and foremost!

C. Johnson June 10, 2014 at 10:21 pm

I know economists like to incompetently pretend everything is linear, but you have to understand it’s not, and the computer science blurbs reported by journalists don’t always take into account scale and runtime.

When you have an algorithm with complexity in space or time of n^2 or n^3, there’s a long way to go before practical applications, and “Moore’s law” doesn’t handwave everything away. In other words, a task like recognizing a million people in the real world isn’t as easy as recognizing 100 in a laboratory.
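To see the scaling point concretely, here is a toy sketch, under the assumption that recognition reduces to comparing a probe face embedding against every enrolled template: the per-query cost grows linearly with the gallery size, and an all-pairs de-duplication pass grows quadratically. The embedding dimension and gallery sizes below are illustrative assumptions, not figures from the study.

```python
# Toy illustration of how brute-force face matching scales with gallery size.
# Not any particular system: embeddings are random, DIM is an assumption.
import time
import numpy as np

DIM = 128  # assumed embedding dimensionality

def brute_force_match(probe, gallery):
    """Return the index of the nearest gallery embedding (one O(n) scan)."""
    dists = np.linalg.norm(gallery - probe, axis=1)
    return int(np.argmin(dists))

rng = np.random.default_rng(1)
for n in (1_000, 10_000, 100_000):
    gallery = rng.standard_normal((n, DIM)).astype(np.float32)
    probe = rng.standard_normal(DIM).astype(np.float32)
    t0 = time.perf_counter()
    brute_force_match(probe, gallery)
    per_query = time.perf_counter() - t0
    # An all-pairs pass costs roughly n such scans: n * O(n) = O(n^2).
    print(f"n={n:>7,}  one query ~{per_query * 1e3:7.2f} ms  (all-pairs ~n times that)")
```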

Marian Kechlibar June 11, 2014 at 2:28 am

In two-party systems, this could actually break the privacy of voting. Let us say that there is a poster (or a leaflet) of one of the parties somewhere in the proximity of the polling station. Now just measure the microexpressions of the voters when they see the poster while walking to the station. Negative emotion – probably votes for the other party.

This is harder to do in proportional systems, of course. But even there, the realistic spread of the vote is among 5-6 parties or so.

Walt G June 11, 2014 at 11:37 am

Take away our ability to lie? Not until they pry the keyboard from my cold, stiff fingers.

