Kerstin Dautenhahn
Professor, Faculty of Engineering
Canada 150 Research Chair in Intelligent Robotics

What makes a machine more than just a piece of hardware? Do we form relationships with our technology? How should we be teaching robots to act? And what are they teaching us?

These are just some of the questions that Kerstin Dautenhahn is exploring as the Canada 150 Research Chair in Intelligent Robotics. Dautenhahn joined the University of Waterloo’s Faculty of Engineering in 2018 to establish the new Social and Intelligent Robotics Research Laboratory.

But advancing the state of the art in social and intelligent robotics is only part of her research agenda.

“Just because you can build something, doesn’t mean you should,” says Dautenhahn. “We need to understand people’s intentions and expectations towards robots and investigate possible consequences of the robots we build.”

Dautenhahn is one of the founders of the field of social robotics. Her research centres on advancing our understanding of fundamental principles of human-robot interaction and how robots contribute to real-world applications.

“Robots can make a useful contribution to society and to our well-being,” Dautenhahn explains. “We need to broaden our imagination about what a robot can look like and what tasks it can perform, so that it complements the skills that people are good at and enjoy doing.”

Can a robot be an acceptable companion to a human?

This question gets to the heart of Dautenhahn’s research. She uses several companion robots for research in her lab and in Waterloo’s RoboHub, a unique facility that encourages multidisciplinary research to explore the potential of robotic technologies.

A companion robot is an autonomous machine capable of carrying out a task that is useful to a human and is performed in a socially acceptable way. This means that the robot is able to interact with humans in a conventional and helpful manner.

“I am particularly interested in applications of companion robots in therapy and education for children, and supporting people with dementia living in long-term care facilities and elderly persons living at home independently,” says Dautenhahn.

Her extensive accomplishments include breaking new ground with robot-assisted therapy for children with autism, who often find communication and social interaction overwhelming and unpredictable.

"Our aim is to make the child feel comfortable with the robot,” explains Dautenhahn. “The experience for many autistic children is to receive negative feedback. The robot makes predictable and positive responses which they can copy and learn from."

Play sessions with robots can have long-term benefits for children, who learn social cues and can practise behaviours with the robot.

At Waterloo, Dautenhahn is developing a variety of companion robots for children whose needs are often underserved, such as children with autism. Future research will also explore how companion robots can help people with dementia, building on her previous work supporting independent living for older adults.

Why do we build robots to look and act like humans?

Dautenhahn explains that humans have a deep-rooted desire to imitate nature. We build our robots in our own image and give them human tasks. And the more human-like we make robots, the more we treat them like humans.

“The way we treat a robot does not change the nature of what they are,” says Dautenhahn. “They are machines. They are not people, even if we want them to be.”

Many people find it easy to personify and empathise with humanoid robots. We name them and use gendered pronouns. We may even feel sorry for them if they fall down or fail at a given task.

But she explains that we could be setting robots up to fail when we anthropomorphize them. Humans can easily become disappointed by a humanoid robot when it does not live up to the expectations placed on it. People are less likely to have such expectations of a more mechanical-looking device, such as a Roomba or an Amazon Echo.

Dautenhahn’s robots are built with this in mind. They are not made to look perfectly human-like, and their mechanical characteristics are left exposed. She wants these visual cues to keep the interactions of the children and adults engaging with the robots grounded in reality.

Can we trust robots?

For Dautenhahn, it is never a question of whether robots are good or bad.

“Robots are machines that are made by us. Any robot is more similar to my smartphone or my toaster than it is to you or me,” she says.

What keeps Dautenhahn up at night is how humans may use robots in the future. She does not distrust robots; the real question, she says, is whether we can trust the people who deploy them.

“It is important to keep having conversations around the legal, ethical and social issues of robots, in addition to tackling the technical challenges of building social and intelligent robots,” says Dautenhahn.

Waterloo researchers from multiple disciplines are studying these issues at the Social and Intelligent Robotics Research Lab. Dautenhahn knows that robots can play an important role in our society; we just need to broaden our imagination about what a robot can do and look like.

If we are to develop intelligent robotics responsibly, it is important to recognize both the limits of machines and the needs of our complex society. A robot designed to assist children with autism offers a glimpse of the best possible future for robotics.

Dautenhahn reminds us that the future of robotics is in our hands. “We build them and it is up to us to decide what they will become.”