How MIT Is Building an App To Detect Emotions in Conversation Using AI

Over the phone you may not always pick up on the emotions in a conversation, but researchers at the Massachusetts Institute of Technology (MIT) have been working on a solution. They have come up with a wearable app that parses a conversation and identifies the emotions in the story being told.


MIT has used artificial intelligence (AI) in a newly developed fitness-tracker-style wearable to collect physical as well as speech data, letting the technology analyze the overall tone of the story in a conversation, and in real time.

The researchers say the AI built into the wearable app can discover which parts of the story in a conversation are sad and which are happy, tracking changes in emotion in five-second intervals.
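The five-second tracking described above can be pictured as a sliding window over the conversation. The sketch below is purely illustrative, not the researchers' code: `classify_window` is a hypothetical stand-in for the real model, and the per-second feature values are made up.

```python
# Hypothetical sketch: split a conversation into five-second windows and
# label each one. classify_window is a toy stand-in for the actual model.

def classify_window(features):
    # Toy rule: positive average feature value reads as "happy".
    return "happy" if sum(features) >= 0 else "sad"

def track_emotion(feature_stream, window_sec=5):
    """Return (start_second, label) for each consecutive five-second window.

    feature_stream: one feature value per second of conversation.
    """
    labels = []
    for start in range(0, len(feature_stream), window_sec):
        window = feature_stream[start:start + window_sec]
        labels.append((start, classify_window(window)))
    return labels

# One-minute conversation: upbeat first half, flat second half.
stream = [0.5] * 30 + [-0.5] * 30
print(track_emotion(stream))
```

A one-minute conversation thus yields twelve labels, one per window, which is what lets the app report emotion shifts within a single story rather than a single overall score.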

It is not yet clear whether MIT's latest findings have been peer-reviewed, but the researchers claim the system determined emotional tone with 83 percent accuracy across a total of thirty-one trial conversations, and that it offers more granular sentiment scores.


All the participants in the MIT research wore a Samsung Simband with the new app installed. During the conversation the band also monitored physical changes, such as heart rate, skin temperature, and movements like fidgeting or waving the arms around. The research differed from similar earlier work in which subjects were simply asked to watch happy or sad videos.

The Samsung Simband is a modular, research-centric wrist wearable that can easily be fitted with sensors and can run custom algorithms on its hardware.

Subjects with monotonous vocal tones or frequent long pauses were described by the AI as sad, while varied speech patterns were labeled happy.
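As a rough illustration of how such vocal cues might be scored, the heuristic below maps low pitch variance (monotone speech) or long pauses to "sad" and varied speech to "happy". This is a hypothetical toy rule with made-up thresholds, not the researchers' actual model.

```python
import statistics

# Hypothetical heuristic (not the MIT model): a monotone voice or long
# pauses push a segment toward "sad"; varied pitch pushes it toward "happy".

def label_segment(pitches_hz, pause_durations_sec,
                  variance_threshold=100.0, pause_threshold=1.0):
    monotone = statistics.pvariance(pitches_hz) < variance_threshold
    long_pauses = any(p > pause_threshold for p in pause_durations_sec)
    return "sad" if (monotone or long_pauses) else "happy"

# Flat voice (pitch barely varies) plus a 2.5-second pause:
print(label_segment([120, 121, 119, 120], [2.5]))   # sad
# Lively, varied voice with only short pauses:
print(label_segment([110, 180, 140, 200], [0.3]))   # happy
```

A real system would of course learn such boundaries from data rather than hard-code them, but the example shows why monotone delivery and long silences are easy signals for a classifier to pick up.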


The researchers add that the app may soon be able to recognize more complex emotions.

According to a graduate student on the research team, their work could be used to help people with anxiety or conditions such as autism or Asperger's.

Tuka Alhanai added that they have taken a step toward a future in which people will have an AI social coach in their pocket.

Alhanai further mentioned, “Developing technology that can take the pulse of human emotions has the potential to dramatically improve how we communicate with each other.”

Another member of the team, PhD candidate Mohammad Ghassemi, said the system uses specialized algorithms to analyze audio and text transcriptions along with physiological signals, determining emotional tone in real time.
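Combining audio, text, and physiological signals is a form of multimodal fusion. The sketch below shows the simplest version of the idea, concatenating per-modality features into one vector before scoring. The feature names and weights are invented for illustration and are not taken from the MIT system.

```python
# Hypothetical sketch of multimodal fusion: merge audio, text, and
# physiological features into one vector, then apply a simple linear score.
# All feature names and weights here are illustrative assumptions.

def fuse_features(audio, text, physio):
    """Concatenate per-modality feature dicts into one ordered vector."""
    vector = []
    for modality in (audio, text, physio):
        # Sort keys so the vector layout is deterministic.
        vector.extend(value for _, value in sorted(modality.items()))
    return vector

def score_tone(vector, weights):
    # Linear score: above zero reads as positive tone, below as negative.
    return sum(w * v for w, v in zip(weights, vector))

audio = {"pitch_var": 0.8, "energy": 0.6}
text = {"pos_words": 0.7}
physio = {"heart_rate": 0.2, "skin_temp": 0.1}

v = fuse_features(audio, text, physio)
print(score_tone(v, [1.0, 0.5, 1.2, -0.3, 0.1]))
```

Early fusion like this keeps the classifier simple; a production system might instead score each modality separately and combine the predictions, but the core idea of pooling heterogeneous signals into one decision is the same.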

Last fall, a team from the same lab, MIT's Computer Science and Artificial Intelligence Laboratory, worked on building a device that could identify human emotions using wireless signals.
