The logical end point is that we will abandon trying to interpret the moods and expressions on others’ faces and instead rely on devices to perform the task for us instantaneously. Here is Motherboard on a computer that Ohio State University researchers trained to recognize complex and subtle emotions far more skillfully than humans can:
For a while now, facial analysis software has been able to distinguish between the six “basic categories” of emotion—happiness, surprise, anger, sadness, fear, and disgust. If you asked me to do the same, I could probably do it. But when you drill down into complex, compound facial expressions such as “happily surprised,” “fearfully angry,” “appalled,” “hatred,” and “awed,” I’d probably blow a couple of them. This computer doesn’t. In fact, it can distinguish between 21 different “complex emotions.”
It’s another step towards machines that can decipher what we feel… in [this] context, it’s easy to imagine a future filled with robotic companions and therapists.