PhD Seminar • Human-Computer Interaction • Pervasive Desktop Computing by Direct Manipulation of an Augmented Lamp

Wednesday, September 18, 2024 9:30 am - 10:30 am EDT (GMT -04:00)

Please note: This PhD seminar will take place in DC 2314 and online.

Yuan Chen, PhD candidate
David R. Cheriton School of Computer Science

Supervisor: Professor Daniel Vogel

Desktop computing, despite its long-standing dominance in personal productivity, remains largely confined to screens. Many efforts to expand beyond a single screen, from multiple monitors to projector-camera units and head-mounted displays, have shown promise. However, these approaches typically only extend content from the desktop display to other devices, and they lack awareness of the physical environment and user activities. This talk explores a novel form of direct manipulation projector-camera system that leverages the unique characteristics of physical lamp movement to manipulate content not only to and from the desktop display, but also to and from other devices and the physical environment, while maintaining awareness of the workspace.

Three projects examine the design, prototyping, and human factors aspects of an augmented lamp system in which the lamp works as an input and output device connecting desktop computing and the physical environment.

In the first project, an interaction design space is introduced for physical direct manipulation using an architect's lamp, with a proof-of-concept system built from a projector and a motion tracking system. We demonstrate its potential usage through three scenarios, describe study results evaluating its potential, and summarize design implications.

In the second project, we study the impact on user performance and interaction strategies when interacting with an augmented lamp in a desktop space. We conduct a controlled experiment in Virtual Reality to examine two control mechanisms for target acquisition tasks in a dynamic peephole display: “coupled”, where the display centre is used for selection, and “decoupled”, where selection is handled by separate input such as direct touch. We find that the two control mechanisms have only subtle differences in total time and error, but that people follow different strategies for coordinating the movement of a dynamic peephole display with different target acquisition techniques.

In the third project, we explore the latter observation in a more general context. Using a controlled Virtual Reality environment, we conduct an experiment to investigate whether what users intend to do with a virtual target impacts how they plan and perform the initial target acquisition. Our results lead to an understanding of user motion profiles before acquisition for different intended interactions with the same target. We discuss how these motion profiles can be used to improve the lamp design, for example by integrating force sensors into the lamp to improve activity awareness.

Together, these findings establish a promising way to connect current desktop computing with the surrounding physical desktop environment, based on a deeper understanding of user activities in that space.


To attend this PhD seminar in person, please go to DC 2314. You can also attend virtually using Zoom.