BUILDING AN AI THAT FEELS
AI systems with emotional intelligence could learn faster and be more helpful.
Lately, we've been considering how we could improve AI voice assistants such as Alexa and Siri, which many people now use as everyday aides. We anticipate that they'll soon be deployed in cars, hospitals, stores, schools, and more, where they'll enable more personalized and meaningful interactions with technology. But to achieve their potential, such voice assistants will require a major boost from the field of affective computing. That term, coined by MIT professor Rosalind W. Picard in a 1997 book of the same name, refers to technology that can sense, understand, and even simulate human emotions. Voice assistants with emotional intelligence should be more natural and efficient than those without it.
Rational and emotional thinking
Consider how such an AI agent could help a person who's feeling overwhelmed by stress. Currently, the best option might be to see a human psychologist who, over a series of costly consultations, would discuss the situation and teach relevant stress-management skills. During the sessions, the therapist would continually evaluate the person's responses and use that information to shape what's discussed, adapting both content and presentation in an effort to ensure the best outcome.
While this kind of treatment is arguably the best available, and while technology is still far from being able to replicate it, it isn't ideal for everyone. Some people feel uncomfortable discussing their feelings with a therapist, for example, and others find the process stigmatizing or time-consuming. An AI therapist could give them an alternative avenue for support, while also conducting more frequent and personalized assessments. One recent review article found that 1 billion people globally are affected by mental and addictive disorders; a scalable solution such as a virtual counselor could be a huge boon.
There's some evidence that people can feel more engaged and are more willing to disclose sensitive information when they're talking to a machine. Other research, however, has found that people seeking emotional support from an online platform prefer responses coming from humans to those from a machine, even when the content is the same. Clearly, we need more research in this area.
Special thanks to IEEE Spectrum for providing resources.