
AI robots with empathy? Insights from Dr. Anna Tavis of NYU

A look at artificial intelligence built with the blueprint of human emotions

“Can artificial intelligence have empathy?”

A few brave souls raised their hands when New York University’s Dr. Anna Tavis asked her audience.

Today’s more sophisticated AI tools don’t just simulate how humans think – they can also decode how humans feel.

Think of facial recognition software that can interpret a person’s sentiments from their micro-expressions, or chatbots that can infer a person’s mood as they type – based primarily on their keystroke patterns. These advancements signal the beginning of what Dr. Tavis calls the age of emotion AI.
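
To make the idea concrete, here is a toy sketch of how a typing-based mood signal could work. Everything in it, from the thresholds to the mood labels to the infer_mood function, is invented for illustration; real emotion-AI systems rely on models trained on far richer behavioural data.

```python
# A toy heuristic, for illustration only: guessing a typist's mood from
# the gaps between keystrokes. The thresholds and mood labels below are
# invented; real emotion-AI systems train models on far richer signals.
from statistics import mean, stdev

def infer_mood(key_intervals_ms: list[float]) -> str:
    """Guess a coarse mood label from intervals between keystrokes (ms)."""
    if len(key_intervals_ms) < 2:
        return "unknown"
    avg = mean(key_intervals_ms)
    jitter = stdev(key_intervals_ms)
    if avg < 120 and jitter > 80:
        return "agitated"   # fast but erratic typing
    if avg < 120:
        return "focused"    # fast, steady typing
    if avg > 350:
        return "hesitant"   # long pauses between keys
    return "calm"

# Intervals (in milliseconds) recorded between successive key presses
print(infer_mood([40, 220, 30, 200, 50]))  # -> "agitated"
print(infer_mood([400, 420, 380, 500]))    # -> "hesitant"
```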

More than a co-pilot, AI is a confidant

No longer just a co-pilot in everyday work, this type of intelligence is designed to read the language of human emotions – analysing expressions and patterns of behaviour and responding to them appropriately.

AI built with the blueprint of human emotions can serve as a task assistant, companion and even confidant for some. It’s taking the leap from artificial intelligence to artificial empathy.

As Dr. Tavis, a clinical professor and chair of the Human Capital Management Department at NYU, puts it: emotion AI delves into what makes us fundamentally human and “different from those machines” – that is, our jumble of emotions, desires, hopes, fears, insecurities and frustrations – broken down and translated into programming language.

Of course, most headlines about AI tend to pit humans against machines. Yet the more optimistic stories often depict human capabilities – such as the capacity to feel, ruminate and hope – as sublime and far superior to the computing power of AI.

However, AI development is also reaching the point where human emotions can be simulated.

How robots are programmed to show empathy

“When we say empathy, we think, ‘I have to be walking in your shoes, understanding how you feel’. That’s one level of it: cognitive empathy,” said Dr. Tavis, who recently published her latest book, The Digital Coaching Revolution.

“But there’s another level of empathy where – based on how I see you behave – I can actually intellectually figure out how you feel. And that’s where the machines are going,” she said.

AI developers are “targeting the recognition of the signs of emotions and programming them into these tools. [Robots] don’t feel, but they act like they feel. They don’t understand what you’re going through. However, they can respond in a way that would look like they do,” Dr. Tavis said.

In fact, one branch of computer science – called affective computing – specialises in the development of machines, especially AI systems, that can “look at this language, recognise, understand, and simulate human emotion”.
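
As a rough illustration of that recognise-and-simulate loop, consider the sketch below. The keyword sets, canned replies and function names are invented for demonstration; production affective-computing systems use trained classifiers over text, voice and facial signals rather than keyword matching.

```python
# A toy sketch, not a real affective-computing system: a chatbot step
# that "recognises" an emotional state from keyword cues and "simulates"
# empathy with a matching canned reply. Keyword sets and replies are
# invented for demonstration purposes.

EMOTION_KEYWORDS = {
    "frustrated": {"stuck", "annoying", "broken", "ugh"},
    "anxious":    {"worried", "deadline", "nervous", "afraid"},
    "pleased":    {"great", "thanks", "love", "awesome"},
}

EMPATHETIC_REPLIES = {
    "frustrated": "That sounds frustrating. Let's take it one step at a time.",
    "anxious":    "I hear the pressure you're under. What's most urgent?",
    "pleased":    "Glad to hear it! What would you like to do next?",
    "neutral":    "Got it. How can I help?",
}

def detect_emotion(message: str) -> str:
    """Map a message to an emotion label via simple keyword cues."""
    words = set(message.lower().split())
    for emotion, cues in EMOTION_KEYWORDS.items():
        if words & cues:
            return emotion
    return "neutral"

def respond(message: str) -> str:
    # The bot feels nothing; it maps detected cues to a reply that
    # merely *looks* empathetic, the distinction Dr. Tavis draws.
    return EMPATHETIC_REPLIES[detect_emotion(message)]

print(respond("This tool is broken, ugh"))
# -> "That sounds frustrating. Let's take it one step at a time."
```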

The aim of affective computing is to advance human well-being by mapping out emotions more precisely.

After all, “how can a machine effectively communicate information if it doesn’t know your emotional state?” Dr. Tavis asked. “If it doesn’t know how you feel, and if it doesn’t know how you’re going to respond to specific content? That is the question that is in front and centre of people who are designing this new technology.”

AI tools today are no longer just about making us more efficient or productive. “That is the price of admission,” Dr. Tavis said. Nowadays, “they’re targeting something very sacred to who we are as humans.”
