The Computer Feels Your Pain

Mar 27 2014 @ 2:39pm

Machines are better than people at telling when pain is real:

In the experiment, more than 150 participants were shown short videos of the faces of people who either dunked their arms in ice water or pretended to dunk their arms in ice water. The group was asked to gauge the authenticity of each pained reaction, and succeeded in weeding out the fakers from the true sufferers only 51.9 percent of the time – no more accurate than if they had simply guessed at random. Next, the researchers showed the same videos to computers with special expression recognition software. The computers’ accuracy? Eighty-five percent.

That the humans performed so poorly is actually no surprise. Scientists have known for a while that even trained physicians can’t reliably differentiate between real and faked pain expressions. But the computers’ results were unprecedented.

How the experiment worked:

[Researcher Marian] Bartlett’s system is based on something called the Facial Action Coding System, or FACS, which was popularized by the psychologist Paul Ekman in the ’70s and ’80s and is used today by everyone from TSA screeners to animators trying to imbue their characters with more realistic facial expressions. It’s a way of describing virtually any facial expression that’s anatomically possible by breaking it down into its component movements — a wrinkle of the nose, a tightening of the eyelid, a lowering of the brow, and so on. The idea is that each of these movements maps onto a specific muscle or set of muscles.
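
To make that more concrete, here is a minimal sketch of how a FACS-style decomposition might be represented in code. The action unit numbers below follow Ekman’s published coding (AU4 is the brow lowerer, AU9 the nose wrinkler, and so on), but the data structures and the 0-to-1 intensity scale are illustrative assumptions, not Bartlett’s actual system:

```python
# Illustrative sketch: representing one facial expression as FACS action units.
# The AU numbers follow Ekman's coding (AU4 = brow lowerer, AU6 = cheek raiser,
# AU9 = nose wrinkler, AU43 = eyes closed); the intensity scale and data
# structures are assumptions for illustration only.

from dataclasses import dataclass

@dataclass
class ActionUnit:
    au: int           # FACS action unit number
    name: str         # anatomical description of the movement
    intensity: float  # 0.0 (absent) to 1.0 (maximal), a simplified scale

# One video frame decomposed into its component movements:
frame = [
    ActionUnit(4,  "brow lowerer",  0.8),
    ActionUnit(6,  "cheek raiser",  0.3),
    ActionUnit(9,  "nose wrinkler", 0.6),
    ActionUnit(43, "eyes closed",   0.2),
]

# A frame becomes a fixed-length feature vector, one slot per tracked AU:
TRACKED_AUS = [1, 2, 4, 6, 7, 9, 10, 12, 20, 25, 26, 43]

def frame_to_features(frame):
    intensities = {u.au: u.intensity for u in frame}
    return [intensities.get(au, 0.0) for au in TRACKED_AUS]

print(frame_to_features(frame))
```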

Bartlett’s team has been working for years to create a computer vision system to automate FACS and to develop machine learning algorithms that can learn to recognize patterns of facial movements that correspond to particular emotions. (They also founded a company, Emotient, based on the same technology — more on that later). The new study is the first to assess how well the system distinguishes genuine from fake facial expressions and compare its performance to that of human observers.
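
As a rough illustration of that machine-learning step, here is a hedged sketch using scikit-learn: per-frame action-unit intensities are summarized over a whole video, then fed to a classifier that labels the expression genuine or faked. The temporal statistics, the linear SVM, and the toy data are all assumptions chosen for illustration; the study’s actual pipeline differs in its details.

```python
# Hedged sketch: training a genuine-vs-faked classifier on action-unit
# features. The aggregation, the linear SVM, and the toy data are
# illustrative assumptions, not the study's actual system.

import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def video_to_features(au_frames):
    """Summarize per-frame AU intensities (frames x AUs) over time.

    Faked expressions tend to differ in their dynamics, so we keep
    simple temporal statistics rather than a single snapshot.
    """
    return np.concatenate([
        au_frames.mean(axis=0),  # average intensity per AU
        au_frames.max(axis=0),   # peak intensity per AU
        np.abs(np.diff(au_frames, axis=0)).mean(axis=0),  # frame-to-frame change
    ])

# Toy stand-in data: 40 videos, 90 frames each, 12 tracked AUs.
videos = rng.random((40, 90, 12))
labels = rng.integers(0, 2, size=40)  # 1 = genuine pain, 0 = faked

X = np.array([video_to_features(v) for v in videos])
scores = cross_val_score(SVC(kernel="linear"), X, labels, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```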

The practical applications:

[Bartlett] thinks automated pain detection could be useful for doctors and nurses working with children. Research suggests that pain is often underreported and undertreated in kids, she says. She’s also developing systems that detect more than just pain. The company she co-founded, Emotient, recently released an app for Google Glass aimed initially at salespeople looking for insight into their customers’ mood. Presumably, any Google Glass wearer will eventually be able to use it.

A real-time, color-coded display indicates which emotions the system is supposedly picking up in the people around you. The company claims it can accurately detect joy, sadness, anger, fear, and disgust. And if you’re being a Glasshole, the app just might clue you in: It’s also programmed to detect contempt.
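
For a sense of how such a display might work, here is a minimal sketch that maps per-emotion scores to overlay colors. The emotion list comes from the article; the scoring, threshold, and color choices are purely illustrative assumptions about the app, not Emotient’s actual design.

```python
# Illustrative sketch: turning per-emotion scores into a color-coded overlay.
# The emotion list matches the article; the scores, threshold, and colors
# are assumptions, not Emotient's actual app.

EMOTION_COLORS = {
    "joy": "green",
    "sadness": "blue",
    "anger": "red",
    "fear": "purple",
    "disgust": "olive",
    "contempt": "orange",
}

def overlay_color(scores, threshold=0.5):
    """Return the color for the strongest emotion, if any clears the bar."""
    emotion, score = max(scores.items(), key=lambda kv: kv[1])
    return EMOTION_COLORS[emotion] if score >= threshold else None

print(overlay_color({"joy": 0.1, "contempt": 0.8, "anger": 0.3}))  # "orange"
```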