Evolutionary Deep Intelligence

Deep learning has shown considerable promise in recent years, producing tremendous results and significantly improving the accuracy of a variety of challenging problems compared to other machine learning methods. However, deep neural networks require high-performance computing systems (such as supercomputer clusters and GPU arrays) due to their highly complex and large computational architectures, and they require machine learning experts to delicately design and fine-tune those architectures. This complexity has grown considerably over time, driven by the demand for ever deeper and larger networks to boost accuracy. As a result, it has become nearly impossible to take advantage of such powerful yet complex deep neural networks in scenarios where computational and energy resources are scarce, such as embedded systems, and increasingly difficult to hand-craft their architectures. Inspired by nature, the team at the VIP Lab has developed several pioneering strategies for enabling powerful yet operational deep intelligence by asking a radically different question: can deep neural networks evolve naturally over generations to become not only highly efficient but also powerful?


Deep Evolution

We have introduced the concept of evolutionary deep intelligence, in which deep neural networks are evolved over multiple generations to become more efficient while remaining highly capable. The 'DNA' of each generation of deep neural networks is encoded computationally and used, along with simulated environmental factors such as those encouraging computational and energy efficiency, to 'give birth' to offspring deep neural networks through natural selection, with the process repeating generation after generation. Thanks to natural selection and random mutations, these 'evolved' offspring networks naturally have more efficient and more varied architectures than their ancestor networks while achieving powerful cognitive capabilities.
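To make the synthesis step concrete, the sketch below simulates one simplified version of this generational loop. It is an illustration under our own assumptions rather than the exact genetic encoding from the papers: the 'DNA' is taken to be per-synapse survival probabilities derived from the parent's synaptic strengths, and a hypothetical env_factor below one plays the role of environmental pressure toward sparser offspring.

```python
import numpy as np

# Simplified sketch of evolutionary synthesis (illustrative assumptions, not
# the exact probabilistic genetic encoding from the papers). The 'DNA' is a
# set of per-synapse survival probabilities derived from the parent's
# synaptic strengths; env_factor < 1 models environmental pressure that
# favours sparser, more efficient offspring.

def synthesize_offspring(parent_weights, env_factor=0.7, rng=None):
    """Sample an offspring layer by stochastically inheriting synapses."""
    rng = rng if rng is not None else np.random.default_rng()
    strength = np.abs(parent_weights)
    # Survival probability: synaptic strength relative to the layer's mean
    # strength, scaled by the environmental factor and capped at 1.
    p_survive = np.minimum(1.0, env_factor * strength / (strength.mean() + 1e-12))
    survives = rng.random(parent_weights.shape) < p_survive  # random 'mutation'
    return parent_weights * survives  # offspring keeps only surviving synapses

# Evolve a single fully connected layer over several generations. In the
# actual framework each offspring would be retrained before synthesizing the
# next generation; this sketch keeps only the stochastic synthesis step.
layer = np.random.default_rng(0).normal(size=(256, 128))
for generation in range(2, 6):
    layer = synthesize_offspring(layer)
    print(f"generation {generation}: {np.count_nonzero(layer)} synapses remain")
```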

Experimental results from a study using the MSRA-B and HKU-IS datasets demonstrated that the synthesized offspring deep neural networks can achieve state-of-the-art F-beta scores while having significantly more efficient network architectures, with a staggering ~48X fewer synapses by the fourth generation compared to the original, first-generation ancestor network. This level of performance was further reinforced by a study using the MNIST dataset, which demonstrated that synthesized offspring deep neural networks can achieve state-of-the-art accuracy (>99%) while having significantly more efficient network architectures, with ~40X fewer synapses by the seventh generation compared to the original, first-generation ancestor network. More remarkably, thirteenth-generation offspring deep neural networks still achieved an accuracy of ~98% with an incredible ~125X fewer synapses than the original, first-generation ancestor network.
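These reduction factors compound multiplicatively across generations. As a rough back-of-the-envelope illustration (our own arithmetic, not a figure reported in the studies), a constant per-generation retention rate r over g generations, i.e. g - 1 synthesis steps, would give an overall reduction of r^-(g-1); inverting this for the reported reductions:

```python
# Back-of-the-envelope illustration (our own arithmetic, not from the studies):
# if every synthesis step kept a constant fraction r of its parent's synapses,
# an overall reduction of R after g generations (g - 1 synthesis steps) would
# imply r = R ** (-1 / (g - 1)).

def implied_retention(reduction, generations):
    """Per-generation retention rate consistent with an overall reduction."""
    return reduction ** (-1.0 / (generations - 1))

for label, reduction, gens in [("MSRA-B/HKU-IS, generation 4", 48, 4),
                               ("MNIST, generation 7", 40, 7),
                               ("MNIST, generation 13", 125, 13)]:
    r = implied_retention(reduction, gens)
    print(f"{label}: ~{reduction}X fewer synapses "
          f"=> each offspring keeps ~{r:.0%} of its parent's synapses")
```

Under this simple model, the deeper MNIST lineage reaches its ~125X reduction through many gentler steps (each generation keeping roughly two thirds of its parent's synapses) rather than a few aggressive ones.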

The concept of evolutionary deep intelligence has won numerous awards, including Best Paper Awards at the NIPS Workshop on Efficient Methods for Deep Neural Networks and at the Conference on Computational Vision and Imaging Systems; it has also been named by MIT Technology Review as one of the most interesting and thought-provoking papers on arXiv, and highlighted on Reddit as one of the papers that demonstrate the beauty of deep learning.

Related people

Directors

Alexander Wong

Students

M. J. Shafiee

Alumni

Akshaya Mishra

Parthipan Siva

Related publications

1. M. Shafiee, A. Mishra, and A. Wong, Deep Learning with Darwin: Evolutionary Synthesis of Deep Neural Networks. Neural Processing Letters, 2016. (chosen by MIT Technology Review as one of the most interesting and thought-provoking papers from the Physics arXiv for the week of June 25, 2016)

2. M. Shafiee and A. Wong, Evolutionary Synthesis of Deep Neural Networks via Synaptic Cluster-driven Genetic Encoding. NIPS Workshop on Efficient Methods for Deep Neural Networks, 2016. (received Best Paper Award)

3. M. Shafiee, E. Barshan, and A. Wong, Evolution in Groups: A Deeper Look at Synaptic Cluster-driven Evolution of Deep Neural Networks. Future Technologies Conference (FTC), 2017.

4. M. Shafiee, A. Chung, F. Khalvati, M. Haider, and A. Wong, Discovery Radiomics via Evolutionary Deep Radiomic Sequencer Discovery for Pathologically-Proven Lung Cancer Detection. Journal of Medical Imaging, 2017.

5. A. Chung, M. Shafiee, P. Fieguth, and A. Wong, The Mating Rituals of Deep Neural Networks: Learning Compact Feature Representations through Sexual Evolutionary Synthesis. IEEE International Conference on Computer Vision (ICCV) Workshops, 2017.

6. A. Chung, P. Fieguth, and A. Wong, Polyploidism in Deep Neural Networks: m-Parent Evolutionary Synthesis of Deep Neural Networks in Varying Population Sizes. Journal of Computational Vision and Imaging Systems, 2017.

7. M. Shafiee, E. Barshan, F. Li, B. Chwyl, M. Karg, C. Scharfenberger, and A. Wong, Learning Efficient Deep Feature Representations via Transgenerational Genetic Transmission of Environmental Information during Evolutionary Synthesis of Deep Neural Networks. IEEE International Conference on Computer Vision (ICCV) Workshops, 2017.

8. M. Shafiee, F. Li, B. Chwyl, and A. Wong, Fast YOLO: A Fast You Only Look Once System for Real-time Embedded Object Detection in Video. Journal of Computational Vision and Imaging Systems, 2017.

9. M. Shafiee, F. Li, and A. Wong, Exploring the Imposition of Synaptic Precision Restrictions for Evolutionary Synthesis of Deep Neural Networks. Conference on Cognitive Computational Neuroscience, 2017.

10. K. Kasiri, M. Shafiee, F. Li, J. Eichel, and A. Wong, Efficient Deep Network Architecture for Vision-Based Vehicle Detection. Journal of Computational Vision and Imaging Systems, 2017.

11. M. Shafiee, F. Li, B. Chwyl, and A. Wong, SquishedNets: Squishing SqueezeNet Further for Edge Device Scenarios via Deep Evolutionary Synthesis. NIPS Workshop on Machine Learning on the Phone and other Consumer Devices, 2017.

12. M. Shafiee, F. Li, B. Chwyl, R. Chen, M. Karg, C. Scharfenberger, and A. Wong, StressedNets: Efficient Feature Representations via Stress-induced Evolutionary Synthesis of Deep Neural Networks. arXiv, 2018.

13. A. Chung, P. Fieguth, and A. Wong, Nature vs. Nurture: The Role of Environmental Resources in Evolutionary Deep Intelligence. Conference on Computer and Robot Vision (CRV), 2018.

14. A. Chung, P. Fieguth, and A. Wong, Polyploidism in Deep Neural Networks: m-Parent Evolutionary Synthesis of Deep Neural Networks in Varying Population Sizes. Journal of Computational Vision and Imaging Systems, 2018. (received Best Paper Award)

15. A. Chung, P. Fieguth, and A. Wong, Mitigating Architectural Mismatch During the Evolutionary Synthesis of Deep Neural Networks. NIPS Workshop on Meta Learning, 2018.