‘Emotion AI’ may be the next trend for business software, and that could be problematic


As businesses experiment with embedding AI everywhere, one unexpected trend is companies turning to AI to help their many newfound bots better understand human emotion. 

It’s an area called “emotion AI,” and PitchBook’s new Enterprise SaaS Emerging Tech Research report predicts this tech is on the rise. 

The reasoning goes something like this: If businesses give AI assistants to execs and employees, and make AI chatbots their front-line salespeople and customer service reps, how can an AI perform well if it doesn’t understand the difference between an angry “What do you mean by that?” and a confused “What do you mean by that?”

Emotion AI claims to be the more sophisticated sibling of sentiment analysis, the pre-AI tech that attempts to distill human emotion from text-based interactions, particularly on social media. Emotion AI is what you might call multimodal, employing sensors for visual, audio, and other inputs combined with machine learning and psychology to attempt to detect human emotion during an interaction.

Major AI cloud providers offer services that give developers access to emotion AI capabilities, such as Microsoft Azure Cognitive Services’ Emotion API or Amazon Web Services’ Rekognition service. (The latter has had its share of controversy over the years.)
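For a sense of what these services hand back to developers: Rekognition’s face analysis returns a list of candidate emotion labels with confidence scores, and the application typically just picks the top one. The sketch below works on a hand-made stand-in for that response shape (no AWS call is made, and the sample values are invented for illustration):

```python
# A stand-in for the kind of response AWS Rekognition's detect_faces
# returns when called with Attributes=["ALL"]: each detected face carries
# a list of emotion labels with confidence scores. Values here are made up.
sample_response = {
    "FaceDetails": [
        {
            "Emotions": [
                {"Type": "CALM", "Confidence": 12.3},
                {"Type": "ANGRY", "Confidence": 81.7},
                {"Type": "CONFUSED", "Confidence": 6.0},
            ]
        }
    ]
}

def dominant_emotion(response):
    """Return the highest-confidence emotion label for the first face, or None."""
    faces = response.get("FaceDetails", [])
    if not faces:
        return None
    emotions = faces[0].get("Emotions", [])
    if not emotions:
        return None
    # Pick the label the service is most confident about.
    top = max(emotions, key=lambda e: e["Confidence"])
    return top["Type"]

print(dominant_emotion(sample_response))  # ANGRY
```

A chatbot wired up this way would route an “ANGRY” result differently from a “CONFUSED” one, which is precisely the distinction the report argues these bots need.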

While emotion AI, even offered as a cloud service, isn’t new, the sudden rise of bots in the workforce gives it more of a future in the business world than it ever had before, according to PitchBook. 

“With the proliferation of AI assistants and fully automated human-machine interactions, emotion AI promises to enable more human-like interpretations and responses,” writes Derek Hernandez, PitchBook’s senior analyst for emerging technology, in the report.

“Cameras and microphones are integral parts of the hardware side of emotion AI. These can be on a laptop, phone, or individually located in a physical space. Additionally, wearable hardware will likely provide another avenue to employ emotion AI beyond these devices,” Hernandez tells TechCrunch. (So if that customer service chatbot asks for camera access, this may be why.)

To that end, a growing cadre of startups is being launched to make it so. This includes Uniphore (with $610 million total raised, including $400 million in 2022 led by NEA), as well as MorphCast, Voicesense, Superceed, Siena AI, audEERING, and Opsis, each of which has also raised modest sums from various VCs, PitchBook estimates.

Of course, emotion AI is a very Silicon Valley approach: Use technology to solve a problem caused by using technology with humans. 

But even if most AI bots will eventually gain some form of automated empathy, that doesn’t mean this solution will really work.

In fact, the last time emotion AI was a hot topic in Silicon Valley — around 2019, when much of the AI/ML world was still focused on computer vision rather than on generative language and art — researchers threw a wrench in the idea. That year, a team of researchers published a meta-review of studies and concluded that human emotion cannot actually be determined from facial movements. In other words, the idea that we can teach an AI to detect a human’s feelings by having it mimic how humans try to do so (reading faces, body language, tone of voice) rests on a shaky assumption.

There’s also the possibility that AI regulation, such as the European Union’s AI Act, which bans computer-vision emotion detection systems for certain uses like education, may nip this idea in the bud. (Some state laws, like Illinois’ BIPA, also prohibit biometric readings from being collected without permission.)

All of which gives a broader glimpse into the AI-everywhere future that Silicon Valley is currently madly building. Either these AI bots will attempt emotional understanding in order to do customer service, sales, HR, and all the other tasks humans hope to assign them, or they won’t be very good at any task that really requires that capability. Maybe what we’re looking at is an office life filled with AI bots on the level of Siri circa 2023. Then again, compared with a management-mandated bot guessing at everyone’s feelings in real time during meetings, who’s to say which is worse?


