Jeff Orchard

Cheriton School of Computer Science

Computational Perception: How a brain understands its environment

By Nicolas Huguet, CBB Biographer
July 20, 2015

Many scientists have thought of perception as a one-way process. Vision, for example, has traditionally been modelled as a feed-forward process: photons enter the eye, and information is then passed up to higher levels of the brain. There is, however, a theory holding that predictions also travel down from higher brain areas to the lower levels of this perceptual hierarchy, so-called feedback. These predictions affect not only how the sensory system senses its surroundings but also how that information is later analyzed at each level of the hierarchy. The sensory input, in turn, lets the brain validate or invalidate its predictions.
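To make this prediction-and-correction loop concrete, here is a minimal sketch of the general idea (often formulated as predictive coding), not Prof. Orchard's actual model. It assumes numpy, and the layer sizes, weights, and learning rate are purely illustrative: a higher level holds a hypothesis, sends a top-down prediction of the sensory input, and is nudged by the resulting prediction error.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions: a small "image" and a handful of higher-level causes.
n_input, n_causes = 16, 4

# Generative (feedback) weights: how higher-level causes predict sensory input.
W = rng.normal(scale=0.5, size=(n_input, n_causes))

# A sensory observation the network tries to explain.
x = rng.normal(size=n_input)

# The higher level's current hypothesis about what caused the input.
r = np.zeros(n_causes)

lr = 0.05
for step in range(200):
    prediction = W @ r       # feedback: top-down prediction of the input
    error = x - prediction   # feed-forward: prediction error at the sensory level
    r += lr * (W.T @ error)  # the error nudges the higher-level hypothesis
    # (Slower, Hebbian-style updates to W would implement learning of the weights.)

print("remaining prediction error:", np.linalg.norm(x - W @ r))
```

In this toy loop, the only signal sent upward is the mismatch between prediction and input, which is the sense in which the sensory system validates or invalidates the brain's predictions.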

Prof. Jeff Orchard, a core member of the Centre for Theoretical Neuroscience at the University of Waterloo and a member of the CBB, is currently modelling this neurological process in a hierarchical learning neural network. The applications of his research to machine vision and machine learning look very promising. In the current state of machine vision, it can be quite difficult, or even impossible, for an image-processing algorithm to recognize certain complex objects in an image. The brain, however, manages this with relatively simple algorithms and for a very large number of objects, so finding out how those algorithms work would unlock new possibilities for machine vision.
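As a rough illustration of what "hierarchical" means here, the single-layer sketch above can be stacked so that each level predicts the activity of the level below it. Again, this is only an assumed toy example in numpy with made-up sizes, not a description of Prof. Orchard's network.

```python
import numpy as np

rng = np.random.default_rng(1)

# Layer sizes: sensory input -> mid-level features -> high-level causes.
sizes = [16, 8, 4]

# Top-down (generative) weights: each layer predicts the one below it.
W = [rng.normal(scale=0.5, size=(sizes[i], sizes[i + 1]))
     for i in range(len(sizes) - 1)]

x = rng.normal(size=sizes[0])          # sensory input
r = [np.zeros(s) for s in sizes[1:]]   # estimates at the two higher levels

lr = 0.02
for step in range(500):
    # Prediction errors at each level of the hierarchy.
    e0 = x - W[0] @ r[0]      # input vs. mid-level prediction
    e1 = r[0] - W[1] @ r[1]   # mid-level vs. top-level prediction
    # Each level is pushed up by the error below it and down by the error above it.
    r[0] += lr * (W[0].T @ e0 - e1)
    r[1] += lr * (W[1].T @ e1)

print("input-level error:", np.linalg.norm(x - W[0] @ r[0]))
```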

Prof. Orchard says the University of Waterloo is renowned for its expertise in designing and simulating large-scale neural networks that perform cognitive tasks such as counting, adding, and pattern completion. This synergistic environment, together with other ongoing projects that will tell us more about the connections in the brain, such as the Connectome Project, will most likely help Prof. Orchard further advance his own research.

[Contact Information]