| Title | Comparative Analysis of Environment Recognition Systems for Control of Lower-Limb Exoskeletons and Prostheses |
|---|---|
| Publication Type | Conference Paper |
| Year of Publication | 2020 |
| Authors | Laschowski, B., W. McNally, A. Wong, and J. McPhee |
| Conference Name | IEEE RAS/EMBS International Conference for Biomedical Robotics and Biomechatronics (BioRob) |
| Conference Location | New York City, NY, USA |
| Keywords | Artificial Intelligence, Biomechatronics, Computer Vision, Environment Recognition, Exoskeletons, Image Classification, Prostheses, Rehabilitation, Wearable Technology |
Environment recognition systems can facilitate the predictive control of lower-limb exoskeletons and prostheses by recognizing the oncoming walking environment prior to physical interactions. While many environment recognition systems have been developed using different wearable technologies and classification algorithms, their relative operational performances have not been evaluated. Motivated to determine the state-of-the-science and propose future directions for research innovation, we conducted an extensive comparative analysis of the wearable technology, training datasets, and classification algorithms used for vision-based environment recognition. The advantages and drawbacks of different wearable cameras and training datasets were reviewed. Environment recognition systems using pattern recognition, machine learning, and convolutional neural networks for image classification were compared. We evaluated the performances of different deep learning networks using a balanced metric called "NetScore", which considers image classification accuracy alongside computational and memory storage requirements. Based on our analysis, future research in environment recognition systems for lower-limb exoskeletons and prostheses should consider developing 1) efficient deep convolutional neural networks for onboard classification, and 2) large-scale open-source datasets for training and comparing image classification algorithms from different researchers.
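The NetScore metric referenced above can be sketched as follows. This is a minimal illustration, assuming the standard formulation Ω = 20·log10(a^α / (p^β · m^γ)), where a is top-1 accuracy (as a percentage), p is the parameter count in millions, m is the number of multiply-accumulate operations in millions, and the default exponents are α = 2, β = 0.5, γ = 0.5; the example network values are hypothetical, not from the paper.

```python
import math

def netscore(accuracy, params_m, macs_m, alpha=2.0, beta=0.5, gamma=0.5):
    """Balanced network score: rewards accuracy, penalizes model size
    and compute cost.

    accuracy -- top-1 classification accuracy as a percentage (0-100)
    params_m -- number of network parameters, in millions
    macs_m   -- multiply-accumulate operations per inference, in millions
    """
    return 20.0 * math.log10(accuracy ** alpha / (params_m ** beta * macs_m ** gamma))

# Hypothetical network: 70% accuracy, 5M parameters, 500M MACs
print(round(netscore(70.0, 5.0, 500.0), 1))  # -> 39.8
```

A higher NetScore indicates a better balance of accuracy against computational and memory cost, which is why the metric suits onboard classifiers for wearable robotic systems with limited power and storage.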