Computers Can Now Read Human Emotions Better Than You Can

The logical end point is that we will abandon trying to interpret the moods and expressions on others’ faces, and instead rely on devices to instantaneously perform the task for us. Motherboard reports on a computer that Ohio State University researchers trained to recognize complex and subtle emotions far more skillfully than humans can:

For a while now, facial analysis software has been able to distinguish between the six “basic categories” of emotion—happiness, surprise, anger, sadness, fear, and disgust. If you asked me to do the same, I could probably do it. But when you drill down into complex, compound facial expressions such as “happily surprised,” “fearfully angry,” “appalled,” “hatred,” and “awed,” I’d probably blow a couple of them. This computer doesn’t. In fact, it can decipher between 21 different “complex emotions.”

It’s another step towards machines that can decipher what we feel… in [this] context, it’s easy to imagine a future filled with robotic companions and therapists.
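To make the idea concrete, here is a minimal, purely illustrative sketch of how such a classifier can be framed: reduce each face to a vector of measurements (for example, facial action-unit intensities) and assign it to the emotion category whose prototype vector lies closest. The categories, numbers, and `classify` function below are invented for this example; they are not the Ohio State system.

```python
# A minimal, illustrative sketch: nearest-prototype classification of
# emotion categories from a handful of made-up facial measurements.
# Nothing here comes from the Ohio State study; the feature values and
# category prototypes are invented purely to show the shape of the task.
from math import dist

# Hypothetical prototypes: mean feature vectors (e.g. action-unit
# intensities) for a few basic and compound emotion categories.
PROTOTYPES = {
    "happy":             [0.9, 0.1, 0.0, 0.8],
    "surprised":         [0.2, 0.9, 0.1, 0.3],
    "happily surprised": [0.8, 0.8, 0.0, 0.6],
    "fearfully angry":   [0.1, 0.4, 0.9, 0.1],
}

def classify(features):
    """Return the category whose prototype is nearest to the feature vector."""
    return min(PROTOTYPES, key=lambda label: dist(features, PROTOTYPES[label]))

if __name__ == "__main__":
    # Measurements lying between "happy" and "surprised" should fall
    # into the compound category rather than either basic one.
    print(classify([0.85, 0.75, 0.05, 0.55]))  # happily surprised
```

A real system would replace these hand-written prototypes with statistics learned from thousands of labeled photographs, but the decision step, comparing a new face against per-category averages, is roughly what the commenters below call “the easy problem.”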


  • Anarchy Pony

    Oh great. It’s just a few short steps from here to a.i. trolls.

  • http://hormeticminds.blogspot.com/ Chaorder Gradient

    Taking a statistical average of face variables is the easy problem… recognition is easy.

    Understanding is the hard problem. As is proper reaction.

  • Dread Raider

    I wonder whether the amount of resources and cash going into this kind of project is being matched by projects teaching humans the same thing.

    I doubt it, and if not then who cares if computers are better at it?
