How are you feeling? AI wants to know

‘The point of emotionally intelligent AI is not to dazzle but to unobtrusively anticipate a person’s needs in the context of the service you’re providing’

“It should always be very clear that you’re talking to a person or you’re talking to a bot. We must proceed with transparency or else the world will get really weird really quickly”

How are you feeling today? This is the question that a new generation of artificial intelligence is getting to grips with. Referred to as emotional AI, these technologies use a variety of advanced methods including computer vision, speech recognition and natural language processing to gauge human emotion and respond accordingly.

Prof Alan Smeaton, lecturer and researcher in the school of computing, Dublin City University (DCU), and founding director of the Insight Centre for Data Analytics, is working on the application of computer vision to detect a very specific state: inattention.

Necessity is the mother of invention, and Help Me Watch was developed at DCU during the pandemic in response to student feedback on the challenges of online lectures.

“Attending lectures via Zoom is distracting if you have to do this in busy spaces like the family kitchen. And you’re not amongst classmates, you’re on your own; it’s easy to get bored with that and let your attention stray,” says Smeaton.


“We developed an application that uses the student’s laptop webcam to detect their face. It doesn’t matter if they are far away, near the webcam or even moving around. Help Me Watch uses face tracking and eye-gaze detection to measure attention levels throughout the lecture.”
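DCU's production pipeline isn't public, but the general recipe (sample webcam frames, find the face, check that the eyes are visible) can be sketched briefly. The Python sketch below uses OpenCV's stock Haar cascades as stand-ins for whatever detectors Help Me Watch actually employs, and treats "face plus both eyes detected" as a crude proxy for attention; it illustrates the idea rather than the project's method.

```python
# Minimal sketch of webcam-based attention sampling with OpenCV.
# Heuristic: a frame "counts" as attentive when a frontal face and both
# eyes are detected, a loose proxy for facing the screen. The stock Haar
# cascades below are stand-ins, not Help Me Watch's actual models.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def frame_is_attentive(frame) -> bool:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        eyes = eye_cascade.detectMultiScale(gray[y:y + h, x:x + w])
        if len(eyes) >= 2:  # both eyes visible, likely facing the screen
            return True
    return False

cap = cv2.VideoCapture(0)  # default laptop webcam
attentive = total = 0
while total < 300:  # sample roughly 300 frames
    ok, frame = cap.read()
    if not ok:
        break
    total += 1
    attentive += frame_is_attentive(frame)
cap.release()
print(f"attention score: {attentive / max(total, 1):.0%}")
```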

Help Me Watch’s dashboard allows the lecturer to observe overall patterns to see what material went down well and what was less engaging. Arguably the use case for the individual student is more compelling: it notices if a student zones out during part of the lecture and sends them the material they missed.
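Turning a stream of attention readings into "here's what you missed" is essentially a matter of finding long low-attention runs. A minimal sketch, assuming one attention score per second and invented threshold values (Help Me Watch's real parameters aren't stated):

```python
# Hypothetical post-processing: given one attention score per second of
# a lecture, find contiguous low-attention runs so the missed portions
# can be sent back to the student. The threshold and the minimum gap
# length are illustrative, not the project's actual parameters.
def missed_segments(scores, threshold=0.5, min_seconds=30):
    segments, start = [], None
    for t, s in enumerate(scores):
        if s < threshold and start is None:
            start = t
        elif s >= threshold and start is not None:
            if t - start >= min_seconds:
                segments.append((start, t))
            start = None
    if start is not None and len(scores) - start >= min_seconds:
        segments.append((start, len(scores)))
    return segments  # list of (start_sec, end_sec) to replay

print(missed_segments([1] * 60 + [0] * 45 + [1] * 60))  # [(60, 105)]
```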

Another DCU project – led by Suzanne Little – also measures attention levels, but in the context of driver fatigue monitoring. Smeaton explains that this application of computer vision is particularly challenging because of the fluctuating light levels a motorist encounters on the road, but the use case is valuable: it would be an important feature for long-haul drivers in particular.
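One standard way to tame those lighting swings is to normalise illumination before running any detector. Whether the DCU project uses exactly this step is an assumption, but contrast-limited adaptive histogram equalisation (CLAHE), available off the shelf in OpenCV, is a typical first move:

```python
# Illustrative pre-processing for fluctuating in-cabin light: CLAHE
# evens out contrast locally, so a face stays detectable through
# tunnels, shade and direct sun. Parameter values are OpenCV defaults
# commonly quoted in tutorials, not the DCU project's settings.
import cv2

clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))

def normalise_illumination(frame):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return clahe.apply(gray)  # feed this to the downstream detector
```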

Facial-imaging data

In the space of emotional AI, researchers and start-ups are working with sensitive and personally identifiable data: not just the facial-imaging data mentioned above but also voice, text and even heart rate or galvanic skin response (how sweaty someone’s skin is).

Smeaton points out that capturing faces on camera and processing the resulting data is done in compliance with GDPR, and that all data is anonymised: a lecturer never sees an individual student’s name, only numerical identifiers such as “student 123”.
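The "student 123" scheme amounts to pseudonymisation: the dashboard works with stable identifiers that can't be read back to a name without a key held elsewhere. A minimal sketch of one common way to do this, using a keyed hash (the key handling and the three-digit label format are illustrative assumptions, not DCU's implementation):

```python
# Illustrative pseudonymisation: the lecturer's dashboard only ever sees
# "student 123"-style labels. A keyed hash (HMAC) yields a stable
# pseudonym without storing a table of real names; only whoever holds
# the key could re-identify anyone. Truncating to three digits is purely
# cosmetic here and could collide in a very large class.
import hashlib
import hmac

SECRET_KEY = b"rotate-and-store-securely"  # hypothetical key management

def pseudonym(student_identity: str) -> str:
    digest = hmac.new(SECRET_KEY, student_identity.encode(), hashlib.sha256)
    return f"student {int(digest.hexdigest(), 16) % 1000:03d}"

print(pseudonym("alice.byrne@dcu.ie"))  # e.g. "student 482"
```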

Beyond data-compliance laws there are other ethical considerations. Dr Alison Darcy, psychologist and founder of digital therapeutics start-up Woebot, says that transparency is essential in order to establish trust with the end-user.

“AI should always announce itself,” she says in reference to Google Duplex, the human-sounding AI assistant first demonstrated in 2018. While some were wowed by the eerily natural voice complete with ‘ums’ and ‘ahs’, AI ethicists were concerned that the person on the other end of the phone would mistakenly think they were talking to another human. Google responded by promising to build in an automated announcement alerting the user that they were interacting with an AI assistant.

“It should always be very clear that you’re talking to a person or you’re talking to a bot. We must proceed with transparency or else the world will get really weird really quickly,” adds Darcy.

Her creation, Woebot, is an AI-powered therapeutic chatbot that helps the user by applying principles of cognitive behavioural therapy (CBT), including mood monitoring and self-tracking.

“How Woebot responds to emotions will depend on the emotional and cognitive state of the individual. If somebody is really upset it won’t start joking around with them; it delivers appropriate empathy and invites the user to be guided through an evidence-based technique to help with the intense emotional state they are experiencing in that moment.”

The app also changes tone or verbal complexity if necessary. As Darcy explains, somebody in a really difficult emotional state has less cognitive capacity for parsing long, complex sentences, so Woebot becomes less verbose, retains its warmth of tone and reins in the humour.
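Woebot's decision logic is proprietary, but the behaviour Darcy describes maps naturally onto a small rule table: the reported state selects a style, and the style caps sentence length and switches humour off. Everything below, from the state names to the canned replies, is invented for illustration:

```python
# Sketch of state-dependent reply styling in the spirit Darcy describes.
# States, style fields and template replies are all hypothetical; the
# point is the shape of the rule: intense emotion means shorter, warmer,
# humour-free responses.
def style_for(state: str) -> dict:
    if state in {"distressed", "very upset"}:
        return {"max_words": 12, "humour": False, "warmth": "high"}
    if state in {"low", "flat"}:
        return {"max_words": 25, "humour": False, "warmth": "high"}
    return {"max_words": 40, "humour": True, "warmth": "medium"}

def reply(state: str) -> str:
    style = style_for(state)
    if not style["humour"]:
        return "That sounds really hard. Want to try a short exercise together?"
    return "Good to hear from you! Shall we check in on how the week has gone?"

print(reply("very upset"))
```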

Self-awareness

“A lot of people assume Woebot passively detects the user’s emotional state [using sentiment analysis techniques] but we have stated from day one that we won’t do that,” says Darcy.

“Woebot asks the user how they are feeling because it’s more important to facilitate emotional self-awareness for the user than it is for Woebot to detect their emotional state. Telling an individual that they sound upset, for example, can make them defensive and cause them to back off.”

Another way AI can be empathetic is by being there when most needed. Woebot has developed a separate application for women monitoring their well-being during pregnancy and their postpartum mental health. Darcy says that 78 per cent of all postpartum conversations occur between 10pm and 5am, hours when new mothers get little sleep and need to talk. Finding a therapist at those hours may be impossible, so this chatbot offers a lifeline.

While some may think that clever chatbots can’t possibly engage with someone on the level a real-life therapist can, Woebot’s peer-reviewed study of 36,000 of its users suggests otherwise: users establish a therapeutic bond with Woebot within three to five days of use, something previously considered unique to human-to-human relationships.

And although it seems counterintuitive, other studies suggest that people feel more relaxed disclosing personal information to an AI entity than to another human being, says Paul Sweeney, EVP of product at conversational middleware platform Webio.

“It is easier to tell an intelligent assistant about sensitive issues like financial difficulty than it is to tell a person on the other end of the phone,” he says.

Webio creates intelligent chatbots for clients in the financial space. These chatbots are far more advanced than the traditional FAQ or rules-based ones found on many websites. Clients can train a unique chatbot on their own company data to teach it how to interact more effectively with their customers and, similar to aspects of Woebot’s functionality, the tone or formality of speech can be tweaked.

“Just changing the language can help. One of our customers got a 30 per cent improvement in responses because we changed the tone and language they were using.

“Webio is automating the human touch to customer service. The interface knows when it can’t help you and automatically connects you with a human agent that can. And over time it gets better as long as there is a human in the loop, improving its decision-making,” says Sweeney.
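The pattern Sweeney describes, handing off when unsure and learning from what the human does, is classic human-in-the-loop design. A hedged sketch, where classify and route_to_agent are hypothetical stand-ins for Webio's internals and the confidence threshold is invented:

```python
# Sketch of confidence-gated escalation with a human in the loop.
# `classify` and `route_to_agent` are hypothetical stand-ins for real
# components; the threshold value is illustrative.
CONFIDENCE_FLOOR = 0.75

def handle(message, classify, route_to_agent, training_log):
    intent, confidence = classify(message)  # e.g. ("payment_plan", 0.62)
    if confidence < CONFIDENCE_FLOOR:
        resolution = route_to_agent(message)        # human agent takes over
        training_log.append((message, resolution))  # feeds later retraining
        return resolution
    return intent

# Toy usage with stub functions standing in for real components.
log = []
print(handle("I lost my job and can't pay",
             classify=lambda m: ("hardship", 0.4),
             route_to_agent=lambda m: "escalated to agent",
             training_log=log))
```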

The emotionally intelligent part of Webio’s tech is that it can tell if a customer is anxious about, say, paying off their credit card. Among the natural language processing techniques used is vulnerability assessment. Older customers, for example, may be more vulnerable, so their queries are prioritised.
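How Webio scores vulnerability isn't spelled out, but prioritising flagged queries is straightforward once a score exists. A minimal sketch with an invented score on a standard priority queue:

```python
# Illustrative prioritisation: queries flagged as coming from potentially
# vulnerable customers jump the queue. The vulnerability score is a
# stand-in for whatever Webio's NLP assessment actually produces.
import heapq

queue = []  # heap ordered so the highest vulnerability score pops first

def enqueue(query: str, vulnerability: float):
    heapq.heappush(queue, (-vulnerability, query))  # negate for max-first

enqueue("I can't pay my credit card this month", 0.9)
enqueue("What are your opening hours?", 0.1)
print(heapq.heappop(queue)[1])  # the vulnerable query is handled first
```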

Voice biomarkers

Sweeney is interested in other, as he puts it, “flashier” kinds of emotional AI like real-time speech emotion detection that uses voice biomarkers such as tone, stress levels and emotional content. They can be very accurate, he says, but this area is going to be fraught and we will have to proceed with caution.
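The raw inputs for that kind of system are measurable enough: pitch and energy contours, speaking rate and so on. A sketch of just the feature-extraction step using the open-source librosa library (the filename is a placeholder, and the contentious mapping from these numbers to emotions is deliberately left out):

```python
# Sketch of the raw material for voice-biomarker analysis: pitch and
# energy contours extracted with librosa. Turning these features into
# emotional states is the fraught part Sweeney warns about; this shows
# only the measurement step. The audio file is a hypothetical example.
import librosa
import numpy as np

y, sr = librosa.load("call_snippet.wav", sr=16000)

# Fundamental frequency (pitch) contour; unvoiced frames come back NaN.
f0, voiced, _ = librosa.pyin(y, fmin=librosa.note_to_hz("C2"),
                             fmax=librosa.note_to_hz("C6"), sr=sr)
# Short-time energy as a rough proxy for vocal stress and intensity.
energy = librosa.feature.rms(y=y)[0]

print(f"median pitch: {np.nanmedian(f0):.0f} Hz, "
      f"energy variance: {np.var(energy):.4f}")
```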

“People can say things they don’t mean. They can use quirks of language that mean one thing to them and another to you. You have to be very careful in how these technologies are used.

“The point of an emotionally intelligent AI is not to dazzle but to understand the person and unobtrusively anticipate their needs in the context of the service you’re providing. Relax them, be more inviting, open and empathetic, use language better – and then invite people into the conversation.”