IDEaS Theoretical Neuroscience Seminar Series | Dimension of Activity in Random Feedforward Networks and Cerebellum-Like Systems

Talk Overview: Neural networks are high-dimensional systems whose activity forms a basis for learning and memory. Measured activity in biological and artificial neural networks does not uniformly fill the space of all possible activity patterns; instead, it is constrained to low-dimensional manifolds whose structure reflects both the architecture of the network and the nature of the inputs it receives. I will introduce the notion of linear embedding dimension as a useful metric for describing neural network activity and discuss its relationship with learning. I will describe our work computing this quantity in feedforward networks and relating it to generalization performance on learning tasks and to the anatomical organization of cerebellum-like systems. I will then describe recent work in which we have begun to analyze the dimension of random recurrent networks in the chaotic state.
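For readers unfamiliar with the metric named in the abstract, below is a minimal sketch of one common way to quantify linear embedding dimension: the participation ratio of the eigenvalues of the activity covariance matrix, dim = (Σᵢ λᵢ)² / Σᵢ λᵢ². The talk abstract does not give an exact definition, so this particular formula, the function name, and the toy data are assumptions for illustration only.

```python
# Sketch only: estimates linear embedding dimension as the participation ratio
# of the covariance eigenvalues (an assumed definition, not taken from the talk).
import numpy as np

def participation_ratio_dimension(activity: np.ndarray) -> float:
    """Estimate linear embedding dimension from an activity matrix.

    activity: array of shape (n_samples, n_neurons), e.g. network responses
    to many input patterns.
    """
    centered = activity - activity.mean(axis=0, keepdims=True)
    cov = centered.T @ centered / (activity.shape[0] - 1)
    eigvals = np.linalg.eigvalsh(cov)        # eigenvalues of the covariance matrix
    eigvals = np.clip(eigvals, 0.0, None)    # guard against small negative values
    return eigvals.sum() ** 2 / (eigvals ** 2).sum()

# Example: activity confined to a 5-dimensional subspace of a 200-neuron network.
rng = np.random.default_rng(0)
latent = rng.standard_normal((1000, 5))      # 5 latent dimensions
mixing = rng.standard_normal((5, 200))       # random embedding into 200 neurons
X = latent @ mixing
print(participation_ratio_dimension(X))      # close to 5
```

With this definition, activity spread evenly over k directions yields a dimension near k, while activity dominated by a single direction yields a dimension near 1.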

Speaker Webpage: http://lk.zuckermaninstitute.columbia.edu/

Host: Hannah Choi
