When our devices can read our emotions

March 1, 2019
Emotion-tracking AI is starting to help machines recognize our moods. Are we ready?

This is one of the latest episodes of “Business Lab,” MIT Technology Review’s new podcast helping business leaders make sense of new technologies coming out of the lab and into the marketplace. In this episode:

Personal assistants like Siri, Alexa, Cortana, or Google Home can parse our spoken words and (sometimes) respond appropriately, but they can’t gauge how we’re feeling—in part because they can’t see our faces. In the emerging field of “emotion-tracking AI,” however, companies are studying the facial expressions captured by our devices’ cameras to help software of all kinds become more responsive to our moods and cognitive states.

At Affectiva, a Boston startup founded by MIT Media Lab researchers Rosalind Picard and Rana El Kaliouby, programmers have trained machine learning algorithms to recognize our facial cues and determine whether we’re enjoying a video or getting drowsy behind the wheel. Gabi Zijderveld, Affectiva’s chief marketing officer and head of product strategy, tells Business Lab that such software can streamline marketing, protect drivers, and ultimately make all our interactions with technology deeper and more rewarding. But to guard against the potential for misuse, she says, Affectiva is also lobbying for industry-wide standards to make emotion-tracking systems opt-in and consensual.