Lecturer: Terry Stewart
Fields: Computational Neuroscience / Neuromorphic Computing
Content
While the brain does perform some sort of computation to produce cognition, that computation is clearly wildly different from the computation done by traditional computers, and indeed from that of traditional machine-learning neural networks. In this course, we identify the type of computation that biological neurons are good at (in particular, implementing dynamical systems) and show how to build large-scale neural models that perform basic aspects of cognition (sensorimotor control, memory, symbolic reasoning, action selection, learning, etc.). These models can either be made biologically realistic (to varying levels of detail) or mapped onto energy-efficient neuromorphic hardware.
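The course description does not name a specific software tool, but the Neural Engineering Framework described in the first reference (Eliasmith and Anderson, 2003) is commonly implemented with the open-source Nengo library; the sketch below is an illustrative assumption, not course material. It shows a population of spiking neurons implementing a simple dynamical system (an integrator, dx/dt = u) by feeding its decoded output back to itself, the kind of neuron-friendly computation the description highlights.

```python
# Minimal sketch (assumption: Nengo / NEF-style modelling, not taken from the course):
# spiking LIF neurons implement the dynamical system dx/dt = u as a neural integrator.

import nengo

tau = 0.1  # synaptic time constant used to realise the recurrent dynamics

model = nengo.Network(label="spiking integrator")
with model:
    stim = nengo.Node(lambda t: 1.0 if t < 0.5 else 0.0)  # brief input pulse
    x = nengo.Ensemble(n_neurons=200, dimensions=1)       # population of spiking neurons

    # NEF mapping of dx/dt = u onto connections:
    # input is scaled by tau, recurrent connection is the identity.
    nengo.Connection(stim, x, transform=tau, synapse=tau)
    nengo.Connection(x, x, synapse=tau)

    probe = nengo.Probe(x, synapse=0.01)  # filtered decoded estimate of x

with nengo.Simulator(model) as sim:
    sim.run(1.0)

# x ramps up while the pulse is on, then holds its value (a simple neural memory).
print(sim.trange()[-1], sim.data[probe][-1])
```

The same recurrent-connection recipe generalises to other dynamical systems by changing the functions computed on the input and feedback connections, which is the sense in which dynamical systems are the computations these models are built from.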
Literature
- Eliasmith, C. and Anderson, C. (2003). Neural engineering: Computation, representation, and dynamics in neurobiological systems. MIT Press, Cambridge, MA.
- Eliasmith, C. et al. (2012). A large-scale model of the functioning brain. Science, 338:1202-1205.
- Kajić, I. et al. (2017). A spiking neuron model of word associations for the remote associates test. Frontiers in Psychology, 8:99.
- Stöckel, A. et al. (2021). Connecting biological detail with neural computation: application to the cerebellar granule-Golgi microcircuit. Topics in Cognitive Science, 13(3):515-533.
Lecturer
Terry Stewart is a Senior Research Officer at the National Research Council Canada. He received his PhD in Cognitive Science at Carleton University and spent ten years as a post-doc in the Centre for Theoretical Neuroscience at the University of Waterloo. His research is on how neural systems compute, involving both building large-scale neural simulations of cognitive behaviour and implementing these models on energy-efficient neuromorphic hardware.
Affiliation: National Research Council Canada
Homepage: http://terrystewart.ca