ExoNet Database: Wearable Camera Images of Human Locomotion Environments

Abstract

Advances in computer vision and deep learning are allowing researchers to develop environment recognition systems for robotic exoskeletons and prostheses. However, small-scale and private training datasets have impeded the development and dissemination of image classification algorithms for classifying human walking environments. To address these limitations, we developed ExoNet - the first open-source, large-scale hierarchical database of high-resolution wearable camera images of human locomotion environments. Unparalleled in scale and diversity, ExoNet contains over 5.6 million images of indoor and outdoor real-world walking environments, which were collected using a lightweight wearable camera system throughout the summer, fall, and winter seasons. Approximately 923,000 images in ExoNet were human-annotated using a 12-class hierarchical labelling architecture. Available publicly through IEEE DataPort, ExoNet offers an unprecedented shared platform to train, develop, and compare next-generation image classification algorithms for human locomotion environment recognition. Besides the control of exoskeletons and prostheses, applications of ExoNet could extend to humanoids and autonomous legged robots.
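As a rough illustration of the intended use described in the abstract, the sketch below fine-tunes an off-the-shelf image classifier on ExoNet-style labelled images. It assumes the annotated images have been exported into a class-per-folder directory (a hypothetical "exonet_labelled/<class_name>/*.jpg" layout); the directory name, backbone choice, and hyperparameters are illustrative assumptions, not the official ExoNet distribution format or the authors' training pipeline.

```python
# Minimal sketch: training a 12-class walking-environment classifier.
# Assumes a class-per-folder image directory (hypothetical layout).
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

NUM_CLASSES = 12  # ExoNet uses a 12-class hierarchical labelling architecture

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Load the labelled images; ImageFolder infers class labels from sub-folder names.
dataset = datasets.ImageFolder("exonet_labelled", transform=preprocess)
loader = DataLoader(dataset, batch_size=32, shuffle=True, num_workers=4)

# Start from an ImageNet-pretrained backbone and replace the final layer.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = model.to(device)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):
    for images, labels in loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.4f}")
```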

Year of Publication
2020
Journal
Frontiers in Robotics and AI
Volume
7
Date Published
12/2020
URL
https://www.frontiersin.org/articles/10.3389/frobt.2020.562061/full
DOI
10.3389/frobt.2020.562061