
Ordering coffee with your feet
New research suggests variations in walking could be enough to tell augmented reality what you want
By Media Relations

Imagine controlling apps with your feet while you walk. This concept is the focus of new research which explores using gait gestures – intentional variations in how you walk – as controls for augmented reality (AR) devices.
“There’s a long history of using feet to control machines – for example, the pedals in a car – but very little research has been done into using the way we walk as an input for a device,” said Ching-Yi Tsai, the lead author of the study and a former visiting scholar at the University of Waterloo’s David R. Cheriton School of Computer Science.
The idea emerged during the pandemic when Waterloo professor of computer science Daniel Vogel, frustrated by having to stop and use his phone with cold fingers while walking to get coffee, wondered if there could be a way to place orders without pausing. This led to a study where volunteers tested 22 different foot motions, rating them on ease of movement, compatibility with walking, and social acceptability.
“Extreme movements like dance steps or a jump would likely be easy for a system to recognize, but these might be harder to perform, and they would deviate too far from normal walking for people to feel comfortable doing them in public,” Vogel said. “We didn’t want users to feel like someone from Monty Python’s Ministry of Silly Walks!”
The research identified seven optimal gait gestures. In a follow-up study, participants used an AR headset displaying a simple menu overlaid on the real world. They tested these gestures to operate a music player, order coffee, and answer calls. The team remotely triggered the commands, as the corresponding AR technology is still in development. A proof-of-concept recognizer was also created, achieving 92 per cent accuracy in identifying the gestures.
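The article does not describe how the proof-of-concept recognizer works internally, but the core idea – flagging intentional deviations from a walker's normal stride – can be sketched in a few lines. The function name, the use of foot-strike timestamps, and the deviation threshold below are illustrative assumptions, not details from the study.

```python
# Hypothetical sketch of stride-variation detection, not the study's
# actual recognizer. A real system would work from headset or shoe
# sensor data; here we assume foot-strike timestamps are already known.

def detect_gestures(strike_times, deviation_threshold=0.3):
    """Return indices of stride intervals that deviate from the walker's
    baseline by more than deviation_threshold (fraction of baseline)."""
    # Interval between consecutive foot strikes = one stride.
    intervals = [b - a for a, b in zip(strike_times, strike_times[1:])]
    # Use the median interval as the walker's baseline cadence, so a
    # few deliberate gesture strides don't skew the estimate.
    baseline = sorted(intervals)[len(intervals) // 2]
    return [
        i for i, interval in enumerate(intervals)
        if abs(interval - baseline) / baseline > deviation_threshold
    ]

# Normal strides every ~0.5 s, with one deliberately stretched stride
# (1.5 s -> 2.3 s) standing in for an intentional gait gesture.
strikes = [0.0, 0.5, 1.0, 1.5, 2.3, 2.8, 3.3]
print(detect_gestures(strikes))  # → [3]
```

A full recognizer would also need to distinguish which of the seven gestures was performed and reject natural variation (uneven ground, crowds), which is presumably where most of the remaining 8 per cent of errors come from.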
“We aren’t at a point yet where AR headsets are widely used,” Tsai said. “But this research shows that if we get there, this input option has got legs!”
The study, Gait gestures: examining stride and foot strike variation as an input method while walking, authored by Tsai, Vogel, and Waterloo researchers Ryan Yen and Daekun Kim, was recently published in the proceedings of UIST 2024.