Lecturer: Terry Stewart
Fields: Computational Neuroscience / Machine Learning
Content
It is tricky to get neural networks to efficiently represent values that change over time, or to represent arbitrary spatial shapes and locations. In this talk, we look at two methods for representing time and space: Legendre polynomials and circular convolution, respectively. We show that these methods not only have desirable computational properties, but also map well onto Time Cells and Place Cells in the brain.
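The two ideas can be sketched in a few lines of NumPy. This is an illustrative sketch only: the window length, polynomial degree, and 256-dimensional random vectors are assumptions for the example, not parameters taken from the papers below. The first part compresses a sliding time window onto a handful of Legendre coefficients (the representation underlying the Legendre Memory Unit); the second binds two vectors into a single location-like vector via circular convolution.

```python
import numpy as np
from numpy.polynomial import legendre

# --- Time: summarize a sliding window with Legendre coefficients ---
t = np.linspace(-1, 1, 200)                # normalized window [-1, 1]
signal = np.sin(np.pi * t)                 # example signal history
coeffs = legendre.legfit(t, signal, deg=5) # 6 coefficients summarize the window
recon = legendre.legval(t, coeffs)         # reconstruct the window from them

# --- Space: bind vectors with circular convolution ---
rng = np.random.default_rng(0)
d = 256
x = rng.normal(0, 1 / np.sqrt(d), d)       # random vector for one axis
y = rng.normal(0, 1 / np.sqrt(d), d)       # random vector for the other axis

def circ_conv(a, b):
    # Circular convolution = element-wise product in the Fourier domain.
    return np.fft.irfft(np.fft.rfft(a) * np.fft.rfft(b), n=len(a))

def involution(a):
    # Approximate inverse of a vector under circular convolution.
    return np.concatenate(([a[0]], a[1:][::-1]))

point = circ_conv(x, y)                    # one d-dim vector encoding a location
recovered = circ_conv(point, involution(y))  # approximately recovers x
```

A degree-5 fit already reconstructs the smooth example window closely, and unbinding with the involution of `y` returns a noisy but recognizable copy of `x`; both properties are what make these representations practical in recurrent networks and vector-symbolic models.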
Literature
- Voelker, Kajić, Eliasmith (2019) “Legendre Memory Units: Continuous-Time Representation in Recurrent Neural Networks” NeurIPS 2019 http://compneuro.uwaterloo.ca/files/publications/voelker.2019.lmu.pdf
- Dumont, Eliasmith (2020) “Accurate representation for spatial cognition using grid cells” CogSci 2020 http://compneuro.uwaterloo.ca/files/publications/dumont.2020.pdf
Lecturer
Terry Stewart is an Associate Research Officer at the National Research Council Canada, working on developing large-scale brain simulations and finding industry applications of such systems using energy-efficient neuromorphic hardware. Previously, he was a post-doctoral research associate working with Chris Eliasmith at the Centre for Theoretical Neuroscience at the University of Waterloo. Terry started as an engineer (B.A.Sc. in Systems Design Engineering, University of Waterloo, 1999), did a master's applying experimental psychology to simulated robots (M.Phil. in Computer Science and Artificial Intelligence, University of Sussex, 2000), and then completed a Ph.D. on cognitive modelling (Ph.D. in Cognitive Science, Carleton University, 2007). Terry is also a co-founder of Applied Brain Research, a start-up company built around low-power neuromorphic computer chips and adaptive neural algorithms.
Affiliation: National Research Council Canada
Homepage: http://terrystewart.ca