Computers Can Now Read Human Emotions Better Than You Can

The logical end point is that we will abandon trying to interpret the moods and expressions on others’ faces, and instead rely on devices to perform the task for us instantaneously. Motherboard reports on a computer that Ohio State University researchers trained to recognize complex and subtle emotions far more skillfully than humans can:

For a while now, facial analysis software has been able to distinguish between the six “basic categories” of emotion—happiness, surprise, anger, sadness, fear, and disgust. If you asked me to do the same, I could probably do it. But when you drill down into complex, compound facial expressions such as “happily surprised,” “fearfully angry,” “appalled,” “hatred,” and “awed,” I’d probably blow a couple of them. This computer doesn’t. In fact, it can distinguish between 21 different “complex emotions.”
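One way to picture where that count of 21 could come from: the article names six basic categories, and compound labels like “happily surprised” pair two of them. A minimal sketch, purely illustrative and not the OSU researchers’ actual method (the label names and pairing scheme here are assumptions):

```python
# Hypothetical sketch: treat compound emotions as unordered pairs of the six
# basic categories named in the article. This is NOT the researchers' method,
# just an illustration of how 21 categories could arise combinatorially.
BASIC = ["happiness", "surprise", "anger", "sadness", "fear", "disgust"]

def compound_labels(basics):
    """Return the basic categories plus every unordered pair of them."""
    labels = list(basics)  # the six basic categories
    for i, a in enumerate(basics):
        for b in basics[i + 1:]:
            labels.append(f"{a}+{b}")  # e.g. "happiness+surprise"
    return labels

labels = compound_labels(BASIC)
print(len(labels))  # 6 basic + 15 pairwise compounds = 21 categories
```

The arithmetic (6 + C(6,2) = 21) happens to match the number the article reports, though named expressions such as “appalled” or “awed” may map onto these pairs differently in the actual study.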

It’s another step towards machines that can decipher what we feel… in [this] context, it’s easy to imagine a future filled with robotic companions and therapists.

3 Comments on "Computers Can Now Read Human Emotions Better Than You Can"

  1. Anarchy Pony | Apr 9, 2014 at 12:06 pm |

    Oh great. It’s just a few short steps from here to a.i. trolls.

  2. Statistical averaging of face variables is the easy problem… recognition is easy.

    Understanding is the hard problem. As is proper reaction.

  3. Dread Raider | Apr 9, 2014 at 4:53 pm |

    I wonder, if we compared the amount of resources and cash going into this kind of project, whether it is being matched by projects teaching humans the same skill.

    I doubt it, and if not then who cares if computers are better at it?
