Over the phone you may miss the emotions in a conversation, but the Massachusetts Institute of Technology (MIT) has researched a solution: a wearable app that parses conversation and identifies the emotions in the story being told.
MIT researchers used artificial intelligence (AI) in a newly developed fitness tracker that collects physical as well as speech data, letting the technology analyze the overall tone of a conversation in real time.
The researchers say the AI built into the wearable app can determine which parts of the story in a conversation are sad and which are happy, tracking changes in emotion at five-second intervals.
It is not yet known whether MIT's latest findings have been peer-reviewed, but the team claims to have determined emotional tone with 83 percent accuracy across thirty-one trial conversations, along with more granular sentiment scores.
All the participants in the MIT study wore a Samsung Simband with the new app installed. During the conversation, the band also monitored physical changes such as heart rate, skin temperature, and movements like fidgeting or waving the arms. The study differed from similar earlier work in which subjects were asked to watch happy or sad videos.
The Samsung Simband is a modular, research-focused wrist wearable that can be easily fitted with sensors, and custom algorithms can be run on its hardware.
Subjects with monotonous vocal tones or frequent long pauses were classified by the AI as sad, while those with varied speech patterns were labeled happy.
The researchers add that the app may soon be able to recognize more complex emotions.
According to a graduate student on the research team, their work could help people with anxiety or conditions such as autism or Asperger's.
Tuka Alhanai added that they have taken a step toward a future in which people will have an AI social coach in their pocket.
Alhanai further mentioned, “Developing technology that can take the pulse of human emotions has the potential to dramatically improve how we communicate with each other.”
Another member of the MIT team, PhD candidate Mohammad Ghassemi, said the AI system uses specialized algorithms to analyze audio and text transcriptions along with physiological signals to determine emotional tone in real time.
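The idea described above, scoring each five-second window of a conversation by combining speech, transcript, and physiological features, can be sketched roughly as follows. This is a minimal illustration under assumed, invented features and thresholds; it is not the MIT team's actual algorithm, whose models and signal processing are far more sophisticated.

```python
# Hypothetical sketch: label each five-second window of a conversation as
# happy or sad by combining simple audio, text, and physiological features.
# All feature names, weights, and thresholds are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Window:
    pitch_variance: float   # variability of vocal pitch (arbitrary units)
    pause_ratio: float      # fraction of the window that is silence
    heart_rate: float       # beats per minute from the wristband
    positive_words: int     # positive words counted in the transcript
    negative_words: int     # negative words counted in the transcript

def score_window(w: Window) -> float:
    """Return a sentiment score: positive suggests happy, negative sad."""
    score = 0.0
    score += 1.0 if w.pitch_variance > 0.5 else -1.0   # monotone speech reads as sad
    score -= 2.0 * w.pause_ratio                       # long pauses lower the score
    score += 0.5 * (w.positive_words - w.negative_words)
    return score

def label_conversation(windows: list[Window]) -> list[str]:
    """Label every five-second window, yielding the emotional arc of the story."""
    return ["happy" if score_window(w) > 0 else "sad" for w in windows]

windows = [
    Window(pitch_variance=0.8, pause_ratio=0.1, heart_rate=72,
           positive_words=3, negative_words=0),
    Window(pitch_variance=0.2, pause_ratio=0.6, heart_rate=65,
           positive_words=0, negative_words=2),
]
print(label_conversation(windows))  # -> ['happy', 'sad']
```

The sequence of per-window labels is what lets such a system report not just a single verdict for a conversation but how its emotional tone shifts over time.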
Last fall, a team from the same lab, MIT's Computer Science and Artificial Intelligence Laboratory, worked on building a device that could identify human emotions using wireless signals.