Computer scientists develop Typealike system for expanded laptop interaction

Wednesday, January 5, 2022

A team of computer scientists has developed a new system that recognizes near-keyboard hand gestures to expand laptop interaction.

The technology is an advance in the field of human-computer interaction that lets users issue commands which would otherwise require keyboard shortcuts or a trip to the mouse.

Typealike was created by a team of current and former graduate researchers with the Cheriton School of Computer Science, including Nalin Chhibber, a recent master’s grad, Hemant Surale, a recent PhD grad, and Fabrice Matulic, senior researcher at Preferred Networks Inc. and a former postdoctoral researcher at Waterloo. The team was supervised by Daniel Vogel, an associate professor of computer science and Cheriton Faculty Fellow.

“It started with a simple idea about new ways to use a webcam,” said Chhibber. “The webcam is pointed at your face, but most of the interaction happening on a computer is around your hands. So, we thought, what could we do if the webcam could pick up hand gestures?”

That initial insight led to the development of a small mechanical attachment that redirects the webcam downward toward the hands. The team then created a software program capable of recognizing distinct hand gestures under varying conditions and for different users, using machine learning techniques to train the Typealike program.
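As an illustration only, not the team’s released code, a recognition pipeline of this kind might capture frames from the redirected webcam and pass each one to a previously trained gesture classifier. The model file, gesture labels, and image size in the sketch below are hypothetical.

```python
# Illustrative sketch only -- not the Typealike source code.
# Captures frames from the downward-facing webcam and classifies each one
# with a previously trained gesture model (path and labels are hypothetical).
import cv2
import torch

GESTURES = ["swipe_left", "swipe_right", "pinch", "none"]  # hypothetical labels
model = torch.jit.load("gesture_model.pt")                 # hypothetical trained model
model.eval()

cap = cv2.VideoCapture(0)  # the webcam, redirected to view the hands
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Normalize the frame to the size and channel order the model was trained on.
    img = cv2.cvtColor(cv2.resize(frame, (128, 128)), cv2.COLOR_BGR2RGB)
    x = torch.from_numpy(img).permute(2, 0, 1).float().unsqueeze(0) / 255.0
    with torch.no_grad():
        pred = model(x).argmax(dim=1).item()
    print("detected gesture:", GESTURES[pred])
cap.release()
```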

“It’s a neural network, so you need to show the algorithm examples of what you’re trying to detect,” said Matulic. “Some people will make gestures a little bit differently, and hands vary in size, so you have to collect a lot of data from different people with different lighting conditions.”
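In broad strokes, training such a network amounts to showing it many labeled frames collected from different people under different lighting. Below is a minimal sketch of that step, assuming the recorded frames are organized into per-gesture folders; the dataset layout, image size, and network architecture are illustrative, not the team’s.

```python
# Illustrative training sketch -- not the Typealike implementation.
# Assumes recorded frames are organized as data/<gesture_name>/<image>.png
import torch
import torch.nn as nn
from torchvision import datasets, transforms
from torch.utils.data import DataLoader

transform = transforms.Compose([
    transforms.Resize((128, 128)),
    transforms.ToTensor(),
])
dataset = datasets.ImageFolder("data", transform=transform)  # hypothetical layout
loader = DataLoader(dataset, batch_size=32, shuffle=True)

# A small convolutional classifier; one output per gesture class.
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 32 * 32, len(dataset.classes)),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(10):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.3f}")
```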

The team recorded a database of hand gestures with dozens of research volunteers. The researchers also had the volunteers complete tests and surveys to learn how to make the program as functional and versatile as possible.

“We’re always setting out to make things people can easily use,” said Vogel. “People look at something like Typealike, or other new tech in the field of human-computer interaction, and they say it just makes sense. That’s what we want. We want to make technology that’s intuitive and straightforward, but sometimes to do that takes a lot of complex research and sophisticated software.”

Learn more about Typealike on the project GitHub page, or read the feature article in Waterloo News.
