PhD Seminar • Human-Computer Interaction • Exploring Uni-manual Around Ear Off-Device Gestures for Earables

Thursday, September 19, 2024 3:00 pm - 4:00 pm EDT (GMT -04:00)

Please note: This PhD seminar will take place online.

Shaikh Shawon Arefin Shimon, PhD candidate
David R. Cheriton School of Computer Science

Supervisor: Professor Jian Zhao

The small form factor of earable (i.e., ear-mounted wearable) devices limits their physical input space. Off-device earable input, using uni-manual gestures in alternate mid-air and on-skin interaction spaces around the ear, can address this limitation. Segmenting these alternate interaction spaces into multiple gesture regions, so that the same off-device gestures can be reused across regions, can expand the earable input vocabulary by a large margin. Although prior earable interaction research has explored off-device gesture preferences and recognition techniques in such interaction spaces, supporting gesture reuse over multiple gesture regions needs further exploration.

We collected and analyzed 7,560 uni-manual gesture motion samples from 18 participants to explore earable gesture reuse through segmentation of the on-skin and mid-air spaces around the ear. Our results show that gesture performance degrades significantly beyond 3 mid-air and 5 on-skin around-ear gesture regions across different uni-manual gesture classes (e.g., swipe, pinch, tap). We also present qualitative findings on end-users' most and least preferred regions (and their associated boundaries) for different uni-manual gesture shapes across both interaction spaces. Our results complement earlier elicitation studies and interaction technologies for earables, helping to expand the gestural input vocabulary and potentially drive future commercialization of such devices.

Attend this PhD seminar on Zoom.