MASc Seminar - Daniel Fischer

Wednesday, July 24, 2013 11:00 am - 11:00 am EDT (GMT -04:00)

Speaker

Daniel Fischer

Title

Video See-Through Mobile Augmented Reality Using Position Based Visual POSE Estimation

Abstract

A technique for real-time object tracking in a mobile computing environment, with application to video see-through Augmented Reality (AR), has been designed, verified through simulation, and implemented and validated on a mobile computing device.  Using position-based visual POSE estimation methods and the Extended Kalman Filter (EKF), the technique is shown to extend flexibly to tracking multiple objects and multiple object models with a single monocular camera on different mobile computing devices.  Using the monocular camera of the mobile computing device, feature points of the object(s) are located through image processing on the display.  The relative position and orientation between the device and the object(s) is then determined recursively by an EKF process.  Once the relative position and orientation is determined for each object, three-dimensional AR image(s) are rendered onto the display as if the device were looking at the virtual object(s) in the real world.   This application and the framework presented could be used in the future to overlay additional information onto the displays of mobile computing devices.  Example applications include robot-aided surgery, where animations could be overlaid to assist the surgeon; training applications that could aid in the operation of equipment; and search and rescue operations, where critical information such as floor plans and directions could be virtually placed onto the display.
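The recursive estimation step described in the abstract can be illustrated with a minimal sketch. This is not the author's implementation: it simplifies the problem to estimating camera translation only (no rotation), assumes a pinhole camera with a known focal length, and uses a known object model of 3D feature points. All names, constants, and the constant-position motion model are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch of an EKF-based position estimate from monocular
# feature-point measurements. Translation-only; rotation is omitted and the
# camera axes are assumed aligned with the object frame (an assumption made
# here for brevity, not part of the original work).

F_LEN = 500.0  # assumed focal length, in pixels

# Known 3D feature points of the object model (object frame), in metres.
MODEL_POINTS = np.array([
    [0.0, 0.0, 0.00],
    [0.1, 0.0, 0.00],
    [0.0, 0.1, 0.00],
    [0.1, 0.1, 0.05],
])

def project(points, t):
    """Pinhole projection of the object points seen from camera position t."""
    rel = points - t                      # points expressed in the camera frame
    u = F_LEN * rel[:, 0] / rel[:, 2]
    v = F_LEN * rel[:, 1] / rel[:, 2]
    return np.column_stack([u, v]).ravel()

def jacobian(points, t):
    """Jacobian of the stacked projections with respect to t."""
    rel = points - t
    H = np.zeros((2 * len(points), 3))
    for i, (x, y, z) in enumerate(rel):
        # d(projection)/d(rel) chained with d(rel)/dt = -I
        H[2 * i]     = -np.array([F_LEN / z, 0.0, -F_LEN * x / z**2])
        H[2 * i + 1] = -np.array([0.0, F_LEN / z, -F_LEN * y / z**2])
    return H

def ekf_step(t_est, P, z, Q, R):
    """One EKF predict/update cycle with a constant-position motion model."""
    P = P + Q                                       # predict: covariance grows
    H = jacobian(MODEL_POINTS, t_est)               # linearize camera model
    y = z - project(MODEL_POINTS, t_est)            # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)                  # Kalman gain
    t_est = t_est + K @ y                           # corrected position
    P = (np.eye(3) - K @ H) @ P                     # corrected covariance
    return t_est, P
```

In use, each video frame would supply a measurement vector `z` of detected feature-point pixel coordinates, and `ekf_step` would refine the relative position estimate recursively, as the abstract describes; the full method additionally estimates orientation, which a real state vector would carry (e.g. as a quaternion).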

Supervisors

William J. Wilson and David W. Wang