Imagine home-based technology that not only provides daily prompts for older adults living with Alzheimer’s disease, but is also emotionally intelligent and understands how best to convince a person to do something in a given situation.

“We realized after years of designing artificially intelligent systems that we can solve the technical problems of an automated prompting system,” says Jesse Hoey, a professor in the University of Waterloo’s David R. Cheriton School of Computer Science. “But convincing a person with Alzheimer’s disease to do something is much more difficult. It depends on the person’s interpretation of the situation — what they think is going on, what they perceive is happening, what cues they’re getting — and, importantly, who they think they are.”

Hoey’s cognitive assistant prototype, known as ACT@Home, goes beyond just cues and prompts to interpret and understand the dynamics of human interaction and respond accordingly. “It’s not just when, but how you prompt a person,” says Hoey. “ACT@Home works by modelling what’s going on in the mind of someone with Alzheimer’s disease when they are prompted to do something.”

As Alzheimer’s disease progresses, it causes problems with reasoning and behaviour and impairs a person’s ability to complete activities of daily living independently. Hoey explains that people with Alzheimer’s disease may start to wash their hands, then forget what they’re doing and why, lose motivation, forget what they’ve done so far and repeat steps, or simply stop partway through. “The person they live with usually steps in to help, but the amount of assistance required can become overwhelming,” he says.

ACT@Home is an automated assistance technology that combines artificial intelligence with mathematical models of Affect Control Theory, a sociological framework proposing that people conduct themselves in ways that generate feelings appropriate to the situation or context.
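To give a flavour of the idea (not ACT@Home’s actual implementation), the short Python sketch below illustrates the central quantity in Affect Control Theory: identities and behaviours carry three-number “EPA” sentiment ratings (Evaluation, Potency, Activity), and an interaction feels right when the impressions an event creates stay close to those sentiments. The labels, numbers, and the simple blending step used for impression formation here are illustrative stand-ins for the empirically estimated equations the theory actually uses.

```python
# A toy sketch of Affect Control Theory's central idea. Identities and
# behaviours carry EPA (Evaluation, Potency, Activity) sentiment ratings;
# an event feels appropriate when the impressions it creates stay close
# to those ratings. All numbers and the impression-formation step below
# are illustrative placeholders, not the empirically estimated equations
# used in Hoey's models.
import numpy as np

EPA = {
    "caregiver": np.array([ 2.5, 1.8,  0.9]),
    "elder":     np.array([ 1.9, 0.4, -0.5]),
    "assist":    np.array([ 2.2, 1.4,  0.8]),
    "command":   np.array([-0.3, 1.9,  1.6]),
}

def transients(actor, behaviour, obj):
    """Stand-in for ACT's impression-formation equations: after an event,
    each element's transient impression drifts toward a blend of the three
    sentiments involved."""
    blend = (EPA[actor] + EPA[behaviour] + EPA[obj]) / 3.0
    return {k: 0.5 * EPA[k] + 0.5 * blend for k in (actor, behaviour, obj)}

def deflection(actor, behaviour, obj):
    """Sum of squared differences between fundamental sentiments and the
    transient impressions an event creates; low deflection means the
    interaction 'feels right' to the people involved."""
    t = transients(actor, behaviour, obj)
    return sum(float(np.sum((EPA[k] - t[k]) ** 2)) for k in t)

# A warm "assist" disturbs the caregiver-elder relationship far less than
# a blunt "command" (roughly 0.6 versus 2.0 with these toy numbers).
print(deflection("caregiver", "assist", "elder"))
print(deflection("caregiver", "command", "elder"))
```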

An elderly man with Alzheimer’s disease will interact with others based on his feelings about who he is and who he thinks others are. He will interact differently with a friend than with a spouse or a doctor. “Humans adjust their interactions based on who the other person is or, more precisely, who the other person feels like. We all engage in this little theatrical performance,” Hoey says.

The interesting discovery is that this performance is not completely lost in people with Alzheimer’s disease. The disease disrupts a person’s comprehension of how the world works — for example, whether to take pill A or pill B before bedtime, or to dry one’s hands after washing them. But a person with Alzheimer’s still remembers what it feels like to interact with others, Hoey explains.
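To illustrate how that preserved feeling could guide prompting, the snippet below continues the toy sketch above, reusing its deflection function. A system might score candidate prompt styles by how little they disturb the person’s felt identity and deliver the one that feels most natural; the identities, styles, and ratings are again invented for illustration and are not ACT@Home’s actual model.

```python
# Continuing the toy model above: score each prompting style by how much
# it deflects the client's felt identity, then pick the best-fitting one.
# The identities and EPA ratings are invented for illustration only.
EPA.update({
    "helper":    np.array([ 2.0, 1.2,  0.6]),   # how the system presents itself
    "father":    np.array([ 2.1, 1.5,  0.3]),   # one possible felt identity
    "remind":    np.array([ 1.5, 0.8,  0.5]),
    "encourage": np.array([ 2.4, 1.3,  1.0]),
    "scold":     np.array([-1.2, 1.4,  1.3]),
})

def best_prompt(client_identity, styles=("remind", "encourage", "scold")):
    """Return the prompt style whose event (helper, style, client) produces
    the least deflection, along with all the scores."""
    scores = {s: deflection("helper", s, client_identity) for s in styles}
    return min(scores, key=scores.get), scores

# For someone who still feels like a "father", warm encouragement scores
# far better than scolding with these toy numbers.
style, scores = best_prompt("father")
print(style, scores)
```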

Hoey and his collaborators have been conducting qualitative interviews with people with Alzheimer’s disease and their caregivers to better understand what it means to be a person with the disease and how they interact with others.

“Our goal is to take the findings from these interviews to develop an inexpensive home-based emotionally intelligent cognitive assistant that helps people with Alzheimer’s disease while lightening the burden on their caregivers,” says Hoey.

The research was conducted in collaboration with the Schlegel-UW Research Institute for Aging, and was supported by AGE-WELL NCE Inc., a member of the Networks of Centres of Excellence program, and the American Alzheimer’s Society.

Image: a computer-generated virtual human with brown hair and brown eyes, wearing a collared shirt.

Virtual Human image supplied by the University of Colorado: Nattawut Ngampatipatpong, Sarel Van Vuuren, and Robert Bowen. Animation prompting platform supported in part by a grant from the National Institute on Disability, Independent Living, and Rehabilitation Research (NIDILRR).