Computer empathy

I am, of course, always interested in the state of technology that allows people to interact with computers without using a keyboard or mouse. This report notes some interesting progress.

Yin’s team has developed ways to provide information to the computer based on where a user is looking, as well as through gestures or speech. One of the basic challenges in this area is “computer vision”: can that visual data be used to “see” the user and “understand” what the user wants to do?

To some extent, that’s already possible. Witness one of Yin’s graduate students giving a PowerPoint presentation and using only his eyes to highlight content on various slides.

Yin says the next step would be enabling the computer to recognize a user’s emotional state. He works with a well-established set of six basic emotions — anger, disgust, fear, joy, sadness, and surprise — and is experimenting with different ways to allow the computer to distinguish among them. Is there enough data in the way the lines around the eyes change? Could focusing on the user’s mouth provide sufficient clues? What happens if the user’s face is only partially visible, perhaps turned to one side?

“Computers only understand zeroes and ones,” Yin says. “Everything is about patterns. We want to find out how to recognize each emotion using only the most important features.”
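That quote describes a standard pattern-recognition recipe: represent a face as a vector of measurements, then learn which few measurements best separate the six emotions. Here is a minimal sketch of that idea in Python using scikit-learn. To be clear, this is not Yin’s actual system; the feature names and training data below are invented placeholders standing in for what a real vision pipeline would extract.

```python
# A hypothetical sketch: classify six basic emotions from a small
# vector of facial measurements, keeping only the most informative
# features. All names and data here are synthetic placeholders.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

EMOTIONS = ["anger", "disgust", "fear", "joy", "sadness", "surprise"]
FEATURES = ["brow_height", "eye_openness", "eye_wrinkle_depth",
            "mouth_width", "mouth_openness", "lip_corner_angle"]

rng = np.random.default_rng(0)
# Synthetic training set: 600 faces, one feature vector per face.
X = rng.normal(size=(600, len(FEATURES)))
y = rng.integers(0, len(EMOTIONS), size=600)

# Keep only the k features that best separate the emotion labels,
# then fit a simple classifier on that reduced representation.
model = make_pipeline(
    SelectKBest(f_classif, k=3),
    LogisticRegression(max_iter=1000),
)
model.fit(X, y)

kept = model.named_steps["selectkbest"].get_support()
print("Most informative features:",
      [f for f, keep in zip(FEATURES, kept) if keep])
print("Prediction for one new face:",
      EMOTIONS[model.predict(rng.normal(size=(1, len(FEATURES))))[0]])
```

With real data instead of random numbers, the feature-selection step is what answers Yin’s questions: if the lines around the eyes or the shape of the mouth carry enough signal, those features survive the cut; if the face is partly hidden, the classifier has to make do with whichever features remain visible.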

Perhaps we will one day live in a world where a computer can determine that we are sad and mercilessly mock us. Perhaps…
