
Source: fortune.com

Artificial intelligence requires us to draft a social contract with our technology, said Rana El Kaliouby, co-founder and CEO of emotion AI company Affectiva, who presented on emotion and AI at Fortune’s Brainstorm Reinvent conference in Chicago on Monday. We’ve got to trust it, she explained.

To build that trust between humans and technology, El Kaliouby said that empathy is key. In other words, machines have to understand the humans using them. When an Amazon Alexa doesn't understand a request, the experience quickly becomes frustrating, and El Kaliouby thinks that frustration boils down to Alexa's lack of empathy.

But, she asked the audience, “What if a computer could tell the difference between a smile and a smirk?”

The face is the gateway to human emotion and interaction, and scientists have studied facial expressions for hundreds of years. Building on the work of psychologist Paul Ekman, who mapped facial muscle movements into action units, AI developers like El Kaliouby can today teach machines to recognize human emotion and react to it.
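To make the action-unit idea concrete, here is a minimal sketch of the mapping step such systems build on. This is not Affectiva's method: real emotion-AI pipelines detect action units from video with computer-vision models, which is assumed away here, and the `EMOTION_PROTOTYPES` table and `classify_emotion` function are illustrative names. The action-unit combinations themselves follow Ekman's published Facial Action Coding System (FACS) pairings.

```python
# Toy sketch: mapping FACS action units (AUs) to basic emotions.
# AU detection from video is assumed to have already happened;
# this only illustrates the AU -> emotion classification step.
# (AU6 = cheek raiser, AU12 = lip corner puller, AU4 = brow lowerer, ...)

EMOTION_PROTOTYPES = {
    "happiness": {6, 12},
    "sadness": {1, 4, 15},
    "surprise": {1, 2, 5, 26},
    "anger": {4, 5, 7, 23},
}

def classify_emotion(detected_aus: set) -> str:
    """Return the emotion whose AU prototype best overlaps the detected AUs."""
    best_emotion, best_score = "neutral", 0.0
    for emotion, prototype in EMOTION_PROTOTYPES.items():
        # Jaccard overlap between detected AUs and the prototype set
        score = len(detected_aus & prototype) / len(detected_aus | prototype)
        if score > best_score:
            best_emotion, best_score = emotion, score
    return best_emotion

# A genuine (Duchenne) smile engages AU6 + AU12 together; a smirk
# typically shows AU12 without AU6 -- the smile-vs-smirk distinction.
print(classify_emotion({6, 12}))        # happiness
print(classify_emotion({1, 2, 5, 26}))  # surprise
```

In practice the hard part is the omitted step, detecting the action units themselves from pixels, which is where the machine-learning models do their work.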

This empathetic technology is already being used in market research and advertising, according to El Kaliouby. She said that nearly a quarter of all Fortune 500 companies currently use AI to gauge the emotional impact of their advertisements. Individuals can also use the technology to measure their own facial movements to improve their interpersonal interactions or public speaking, she said.

Going forward, she added, emotion AI can be used by teachers to measure how well students are absorbing their lessons, by doctors to help assess the mental health of their patients, and in cars to take the wheel from a drowsy driver.

El Kaliouby wrapped up her presentation by noting that, though there are myriad ethical implications in the increased use of AI, technology is ultimately neutral. Humans choose whether to use it for good or evil and whether to build or break trust.
