Reading the room: AI to help autistic people interpret emotions better

Thursday, January 19, 2023

For most of us, social interactions are taxing, tedious or time well spent. For the 1.5% of Canadian children and youth (five to 17 years of age) who have been diagnosed with autism, they can be inscrutable.  

There is no one defining characteristic of autism. Some of the more typical manifestations of autism relate to emotional awareness. Autistic people can have difficulty recognizing emotional cues in social interactions, affecting how they read body language, tone of voice or facial expressions. This can lead to awkwardness, misunderstandings and even loneliness. It can also affect how some autistic people engage with their own emotions and interests, often causing confusion and withdrawal.

With the rise of new technologies such as artificial intelligence (AI) and machine learning (ML), however, facial expression classification (FEC) systems could help autistic and neurotypical people engage with one another more effectively.

While the potential of FEC systems has already been explored to some degree, Dr. Alexander Wong, a systems design engineering professor and Canada Research Chair in Artificial Intelligence and Medical Imaging at the University of Waterloo, believes there is always room for improvement, particularly when it comes to designing support systems for autistic people that are user-friendly and operate in real-time, real-world scenarios.

Innovation that works

Wong is constantly trying to identify areas where AI can have a positive impact and where innovation can deliver real-world, operational results.

“Theoretical AI is very important but not enough for me,” says Wong. “I’m an engineer so I’m always looking for ways to take research into the real world. I was introduced to the neurological patterns of autism through my medical imaging work and got hooked on the feasibility of developing AI to help those autistic people who struggle with reading emotional expression to feel more connected.”

Wong leads a project called Emotion Discovery for Autistic Individuals that uses AI and deep learning (a branch of ML) to provide convenient tools that help those in need read and express emotions with greater understanding.

This project is one of many that fall within AI for Social Good, a research initiative funded by Microsoft and run through the Waterloo.AI Institute that focuses on putting new technologies to good use for the benefit of society.

“Helping improve people's lives by empowering researchers with advanced technologies is one of the main goals for the AI for Good program from Microsoft,” says Erin Chapple, corporate vice president for Azure Core product and design at Microsoft.

“It’s exciting to see Azure’s robust compute power and AI capabilities powering research projects like Alex’s to delve into emotion discovery for individuals who are autistic.”

Emotionally intelligent AI

Wong’s project uses AI to automatically recognize facial expressions based on visual information captured by a camera. He and his team employed deep learning to build TimeConvNet and EmotionNet Nano, two deep neural networks that act as digital representations of the brain.

They fed the neural networks hundreds of videos of people of all ages, genders and ethnicities, all expressing a range of emotions, from the more obvious ones such as happiness and sadness to the more nuanced ones like embarrassment and irritation.

“We explored the use of video to teach the system’s neurons and synapses how to tell the difference between one emotion and another,” explains Wong. “Facial expressions can change quickly during conversations and can even change within a single emotion — something static imagery can’t convey. Our AI had to be able to capture those important subtleties and deliver accurate feedback in real-time.”
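To make that concrete, here is a minimal sketch in PyTorch of how a video-based expression classifier like this can be structured. The architecture, layer sizes and seven-emotion label set are illustrative assumptions, not the published TimeConvNet or EmotionNet Nano designs.

```python
# Illustrative sketch only: a small network that classifies a short
# clip of face crops into one emotion, using per-frame spatial features
# plus a temporal convolution over the sequence of frames.
import torch
import torch.nn as nn

EMOTIONS = ["happy", "sad", "angry", "surprised",
            "fearful", "disgusted", "neutral"]  # assumed label set

class TemporalExpressionNet(nn.Module):
    def __init__(self, num_classes: int = len(EMOTIONS)):
        super().__init__()
        # Per-frame spatial features from a small 2D CNN.
        self.spatial = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        # A 1D convolution over time captures how the expression evolves
        # across frames, the subtlety static images cannot convey.
        self.temporal = nn.Conv1d(32, 64, kernel_size=3, padding=1)
        self.head = nn.Linear(64, num_classes)

    def forward(self, clip: torch.Tensor) -> torch.Tensor:
        # clip: (batch, time, channels, height, width)
        b, t, c, h, w = clip.shape
        feats = self.spatial(clip.reshape(b * t, c, h, w)).reshape(b, t, -1)
        feats = self.temporal(feats.transpose(1, 2))  # (b, 64, t)
        return self.head(feats.mean(dim=2))           # pool over time

model = TemporalExpressionNet()
logits = model(torch.randn(1, 16, 3, 64, 64))         # one 16-frame clip
print(EMOTIONS[logits.argmax(dim=1).item()])
```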


By telling the AI which emotions the changing facial expressions conveyed over time, the researchers taught the system to identify single emotions, as well as emotional progressions, with greater accuracy.
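In code, that supervised signal could look like the following continuation of the sketch above: each clip is paired with a human-annotated emotion label, and the network is corrected whenever it misreads one (the data here is synthetic).

```python
# Continues the illustrative TemporalExpressionNet sketch above.
# Each training clip carries an annotated emotion label; the
# cross-entropy loss corrects the network when it misreads one.
import torch
import torch.nn.functional as F

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
clips = torch.randn(8, 16, 3, 64, 64)            # eight 16-frame clips
labels = torch.randint(0, len(EMOTIONS), (8,))   # annotated emotions

for step in range(10):
    logits = model(clips)
    loss = F.cross_entropy(logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```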

When compared with other neural networks designed to read facial expressions for emotion recognition, Wong’s AI system outperforms the lot.

Tests show it to be smaller, faster, more accurate and more energy efficient than other options. To achieve such high performance, Wong and his team did something unique.

“We used AI to build an AI that meets our project’s operational requirements,” Wong says. “It made sense to use our expertise in AI to create a system fit for purpose from the get-go, rather than handcraft something into existence that typically requires tinkering afterward. Industrial collaboration and support make this kind of creative work possible.”
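“Using AI to build an AI” typically means searching over candidate architectures against operational budgets rather than hand-crafting a single network. The toy random-search loop below illustrates the general idea only; it is a deliberate simplification, not the machine-driven design method Wong’s team actually used.

```python
# A toy architecture search: sample small CNNs and keep the best one
# that fits size and latency budgets. Purely illustrative; a real
# search would score candidates by validation accuracy, not size.
import random
import time

import torch
import torch.nn as nn

def build_candidate(widths):
    """Small CNN whose per-layer channel counts come from the search."""
    layers, in_ch = [], 3
    for out_ch in widths:
        layers += [nn.Conv2d(in_ch, out_ch, 3, stride=2, padding=1), nn.ReLU()]
        in_ch = out_ch
    layers += [nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(in_ch, 7)]
    return nn.Sequential(*layers)

def meets_budget(model, max_params=50_000, max_ms=50.0):
    """Check the operational constraints: model size and inference time."""
    n_params = sum(p.numel() for p in model.parameters())
    start = time.perf_counter()
    with torch.no_grad():
        model(torch.randn(1, 3, 64, 64))
    latency_ms = (time.perf_counter() - start) * 1000
    return n_params <= max_params and latency_ms <= max_ms, n_params

best, best_size = None, float("inf")
for _ in range(20):
    widths = [random.choice([8, 16, 32]) for _ in range(random.randint(2, 4))]
    candidate = build_candidate(widths)
    ok, size = meets_budget(candidate)
    if ok and size < best_size:          # prefer the smallest valid net
        best, best_size = candidate, size
```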

Seeing is believing

Wong’s AI system has been discussed at international research conferences and has been the subject of four well-received scientific articles. But the true test of any innovation is its applicability in the real world.

The next phase for Wong and his team is to test their technology among autistic individuals. To that end, they are now working on the development of AEGIS (Augmented-reality Expression Guided Interpretations System), an assistive platform that can deploy the AI on a variety of everyday devices including smartphones, smart glasses and video conference systems.

Assistive technologies are meant to make people’s lives easier, but not all of them do. Understandably, most autistic individuals are unwilling to lug around large, bulky and expensive devices as they go about their daily lives. Wong has created an AI so small it can, for example, be embedded on a chip to augment virtual reality (VR) glasses.

Another example is the system’s ability to augment video screens for enhanced recognition of facial expressions and their corresponding emotions. Let’s say two colleagues are having an online meeting and one of them is autistic. The AI can read the neurotypical colleague’s facial expressions during the video call and feed that information back to the autistic individual in the form of emoji pop-ups.

“Emojis work really well here,” Wong says. “There is an ever-increasing range of emoji moods and faces, and they’re well used and understood by most people.”
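Continuing the Python sketches, the emoji feedback loop can be pictured like this: the classifier’s predicted label for each analysed frame is mapped to an emoji that pops up beside the speaker’s video tile. The label set and mapping are illustrative, not drawn from the AEGIS platform.

```python
# Illustrative mapping from predicted emotion labels to emoji pop-ups.
EMOJI = {
    "happy": "😊", "sad": "😢", "angry": "😠", "surprised": "😮",
    "fearful": "😨", "disgusted": "🤢", "neutral": "😐",
}

def emoji_for(label: str) -> str:
    """Emoji shown to the user for one analysed video frame."""
    return EMOJI.get(label, EMOJI["neutral"])   # fall back to neutral

# Each analysed frame yields a label from the expression classifier;
# the labels are hard-coded here for demonstration.
for label in ["neutral", "happy", "surprised"]:
    print(emoji_for(label))
```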

Wong also plans to create learning videos that provide autistic audiences with ongoing emotional discovery outside of real-time social interactions.

Transparency and trust

These scenarios present some deployment challenges that Wong is considering very carefully, such as privacy. For example, an autistic individual agrees to wear a pair of VR glasses augmented with the AI to help them engage with another person. But what about the other person, whose emotions are being read and recorded by the AI?

The next step is to engage with the autistic community to ensure that the tech delivers real value.

“While operationalization is the only way to really determine the value of our AI,” says Wong, “there is a lot of sensitive information involved that we need to navigate thoroughly and thoughtfully. Launching new tech with the best of intentions is not good enough. We need to work with autistic people interested in our tech to think out all its possible consequences and ramifications and mitigate the potential for risk. Only then can we go live.”