Wearable technology provides a convenient way to collect human motion data in daily-living conditions. For example, wearable bands and smartwatches (e.g., Apple Watch, Samsung Gear, Fitbit) are widely used to track users' physical activities. However, the ability to understand human motion through a wearable device is limited to simple tasks (e.g., counting steps) because it is challenging to capture the large variability of human motion with a small number of wearable sensors.
In this research, we dive into wearable sensor data and aim to provide practical solutions for fitness training and healthcare using wearable technology.
Most gait analysis in clinical settings is based on visual observation, which lacks sensitivity, whereas more advanced gait analysis techniques are typically too expensive and time-consuming.
The goal of this work is to develop methods and tools that enable reliable dynamic balance and gait for bipedal robots. Currently, the project focuses on developing and applying methods for quantifying and measuring the kinematic and dynamic properties of the robot that are critical to dynamic bipedal gait and balance.
While autonomous robots are finding increasingly widespread application, specifying robot tasks usually requires a high level of expertise. This work focuses on enabling a broader range of users to direct autonomous robots by designing human-robot interfaces that allow non-expert users to set up complex task specifications. To achieve this, we investigate how user preferences can be learned through human-robot interaction (HRI).
Living Architecture Systems are distributed, large-scale immersive architectural spaces with qualities strikingly close to those of living systems: they aim to move, respond, learn, and renew themselves through chemical exchanges, and to be adaptive and empathic toward their inhabitants.
This research was conducted at the CARIS Laboratory at the University of British Columbia, working with Professor Elizabeth Croft. The goal of this research was to develop a human-robot interaction strategy that ensures the safety of the human participant through planning and control. We focused on quantifying the level of danger present in the interaction, and then acting to minimize that danger, both in the planning stage and during real-time control.
Model-based control offers numerous advantages but requires an accurate representation of the system dynamics. However, it is difficult to formulate an analytical model that fully captures complex nonlinear effects such as friction and backlash.
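As a concrete illustration of the modeling difficulty, joint friction is often approximated with a simple Coulomb-plus-viscous model identified from data. The sketch below (an illustrative example with synthetic data, not this project's actual method; the parameter values are assumptions) fits such a model by least squares:

```python
import numpy as np

# Hypothetical example: identify a Coulomb + viscous friction model
#   tau_f = c * sign(v) + b * v
# from synthetic joint-velocity / friction-torque measurements.
rng = np.random.default_rng(0)
v = rng.uniform(-2.0, 2.0, 200)                 # joint velocities (rad/s)
true_c, true_b = 0.5, 0.8                       # assumed "true" parameters
tau = true_c * np.sign(v) + true_b * v          # friction torque
tau += 0.01 * rng.normal(size=v.shape)          # small measurement noise

# Least squares: each column of A is one basis function of the model.
A = np.column_stack([np.sign(v), v])
(c_hat, b_hat), *_ = np.linalg.lstsq(A, tau, rcond=None)
print(c_hat, b_hat)  # estimates close to 0.5 and 0.8
```

Even a well-identified model of this form omits effects such as stiction and backlash, which is exactly why purely analytical dynamics models fall short for model-based control.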
The aim of this project is to complete the electromechanical design and dynamic modeling of a biped humanoid robot in collaboration with Quanser. Our goal is to build a highly robust platform that can support a variety of research applications in the Adaptive Systems Laboratory.