Description:
Robots are becoming increasingly pervasive in our daily lives; they range from simple toys to more advanced social robots that are used as educational tools and even as companions. Despite preliminary evidence that child-robot interaction can support children's education and development, concerns have been raised about data security and privacy, as well as other potential risks associated with robotics.
In this project, our objective is to conduct an initial investigation into perceptions of the security and privacy risks associated with robots that interact with children. This will be carried out via user studies with families, in which children and their parents interact with a state-of-the-art educational humanoid robot, NAO, in moderated and controlled lab sessions.
The objectives of the project are:
- Understand how children (ages 7–13) comprehend and perceive security and privacy risks associated with a humanoid robot.
- Identify similarities and differences between children's and parents' perceptions of security and privacy risks in interactive robots.
The goal is to derive design recommendations and guidelines for security and privacy mechanisms that support the preferences and needs of families using robots for education and entertainment.