ExoNet Database: Wearable Camera Images of Human Locomotion Environments

Title: ExoNet Database: Wearable Camera Images of Human Locomotion Environments
Publication Type: Journal Article
Year of Publication: 2020
Authors: Laschowski, B., W. McNally, A. Wong, and J. McPhee
Journal: Frontiers in Robotics and AI
Date Published: 12/2020
Keywords: Artificial Intelligence, Biomechatronics, Computer Vision, Deep Learning, Environment Recognition, Exoskeletons, Prosthetics, Rehabilitation Robotics, Wearable Technology

Advances in computer vision and deep learning are allowing researchers to develop environment recognition systems for robotic exoskeletons and prostheses. However, small-scale and private training datasets have impeded the development and dissemination of image classification algorithms for classifying human walking environments. To address these limitations, we developed ExoNet, the first open-source, large-scale hierarchical database of high-resolution wearable camera images of human locomotion environments. Unparalleled in scale and diversity, ExoNet contains over 5.6 million images of indoor and outdoor real-world walking environments, which were collected using a lightweight wearable camera system throughout the summer, fall, and winter seasons. Approximately 923,000 images in ExoNet were human-annotated using a 12-class hierarchical labelling architecture. Available publicly through IEEE DataPort, ExoNet offers an unprecedented shared platform to train, develop, and compare next-generation image classification algorithms for human locomotion environment recognition. Besides the control of exoskeletons and prostheses, applications of ExoNet could extend to humanoids and autonomous legged robots.
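This page does not spell out the 12-class hierarchical labelling architecture itself. As a minimal sketch, assuming a two-level hierarchy with hypothetical indoor/outdoor parent categories and placeholder terrain subclasses (the real class names are defined in the paper and the IEEE DataPort release), the label space could be flattened into the form a standard image classifier trains on:

```python
# Hypothetical sketch of a two-level hierarchical label space for
# wearable-camera locomotion images. All parent and child names below
# are placeholders, NOT the actual ExoNet class names.
HIERARCHY = {
    "indoor": [
        "level-ground", "incline-stairs", "decline-stairs",
        "ramp", "door", "other",
    ],
    "outdoor": [
        "level-ground", "incline-stairs", "decline-stairs",
        "ramp", "uneven-terrain", "other",
    ],
}

def flat_classes(hierarchy):
    """Flatten a two-level hierarchy into 'parent/child' class labels."""
    return [
        f"{parent}/{child}"
        for parent, children in hierarchy.items()
        for child in children
    ]

labels = flat_classes(HIERARCHY)
print(len(labels))  # 12 leaf classes in this illustrative hierarchy
```

Keeping the parent category in each flattened label preserves the hierarchy, so coarse (indoor vs. outdoor) and fine-grained predictions can be evaluated from the same classifier output.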
