Oral Session 3

Memory Reactivation in Awake and Sleep States – By introducing arrays of microelectrodes into hippocampal, thalamic, and neocortical areas of freely behaving rodents, we have characterized the detailed structure and content of memory patterns across ensembles of individual neurons as they are formed during spatial behavior and reactivated during quiet wakefulness and sleep. I will describe the contributions of these brain systems to the expression and coordination of memory reactivation, including recent results demonstrating the ability to influence reactivated memory content during sleep.

Correlations strike back (again): the case of associative memory retrieval – It has long been recognised that statistical dependencies in neuronal activity need to be taken into account when decoding stimuli encoded in a neural population. Less studied, though equally pernicious, is the need to take account of dependencies between synaptic weights when decoding patterns previously encoded in an auto-associative memory. We show that activity-dependent learning generically produces such correlations, and failing to take them into account in the dynamics of memory retrieval leads to catastrophically poor recall. We derive optimal network dynamics for recall in the face of synaptic correlations caused by a range of synaptic plasticity rules. These dynamics involve well-studied circuit motifs, such as forms of feedback inhibition and experimentally observed dendritic nonlinearities. We therefore show how addressing the problem of synaptic correlations leads to a novel functional account of key biophysical features of the neural substrate.
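For readers who want a concrete baseline, the sketch below (Python, illustrative rather than the authors' model) sets up the standard auto-associative scenario the abstract starts from: Hebbian outer-product storage, which makes every weight a sum over the same stored patterns and hence statistically dependent on the other weights, followed by conventional recall dynamics that treat the weights as if independent. With the loading chosen near the classic Hopfield capacity, recall from a corrupted cue degrades; correlation-aware retrieval dynamics of the kind derived in the talk target exactly this regime. All names and parameters here are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

N, P = 200, 30                  # neurons and stored patterns (illustrative)
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian (outer-product) learning: every weight sums contributions from
# the same P patterns, so weights are statistically dependent on each other.
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0.0)

def recall(cue, steps=50):
    """Conventional recall dynamics that treat weights as if independent."""
    s = cue.astype(float).copy()
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1.0
    return s

# Cue with a corrupted stored pattern and measure the retrieval overlap.
target = patterns[0]
flip = np.where(rng.random(N) < 0.2, -1, 1)      # flip ~20% of the bits
overlap = recall(target * flip) @ target / N
print(f"overlap with stored pattern after recall: {overlap:.2f}")
```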

A memory frontier for complex synapses – An incredible gulf separates theoretical models of synapses, often described solely by a single scalar value denoting the size of a postsynaptic potential, from the immense complexity of the molecular signaling pathways underlying real synapses. To understand the functional contribution of such molecular complexity to learning and memory, it is essential to expand our theoretical conception of a synapse from a single scalar to an entire dynamical system with many internal molecular functional states. Theoretical considerations alone demand such an expansion: network models with scalar synapses, assuming finite numbers of distinguishable synaptic strengths, have strikingly limited memory capacity. This raises a fundamental question: how does synaptic complexity give rise to memory? To address it, we develop new mathematical theorems elucidating the relationship between the structural organization and the memory properties of complex synapses that are themselves molecular networks. Moreover, in proving these theorems, we uncover a framework, based on first-passage-time theory, that imposes an order on the internal states of complex synaptic models, thereby simplifying the relationship between synaptic structure and function.
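As a concrete toy version of this expanded view (a minimal sketch under assumed dynamics, not the authors' theorems), the snippet below treats a single synapse as a Markov chain over M internal molecular states whose observable strength is a function of the current state. It encodes one potentiation event against the stationary background of ongoing plasticity and tracks the resulting memory signal as subsequent random events wash it out. The chain topology, state count M, hop probability q, and strength readout w are all illustrative assumptions.

```python
import numpy as np

def chain(M, q, up=True):
    """Row-stochastic transition matrix over M internal states of a single
    synapse: hop one state up (potentiation) or down (depression) with
    probability q, otherwise stay; boundaries are reflecting."""
    T = np.eye(M) * (1.0 - q)
    for i in range(M):
        j = min(i + 1, M - 1) if up else max(i - 1, 0)
        T[i, j] += q
    return T

M, q = 12, 0.5                         # illustrative state count and hop rate
w = np.where(np.arange(M) < M // 2, -1.0, 1.0)   # weak vs. strong states
T_pot, T_dep = chain(M, q, up=True), chain(M, q, up=False)
T_avg = 0.5 * (T_pot + T_dep)          # ongoing, unbiased plasticity events

# Stationary distribution of ongoing plasticity (left eigenvector of T_avg).
vals, vecs = np.linalg.eig(T_avg.T)
p_inf = np.real(vecs[:, np.argmax(np.real(vals))])
p_inf /= p_inf.sum()

# Memory curve: encode one potentiation event, then let subsequent random
# events erode it; the signal is the excess expected strength over baseline.
baseline = p_inf @ w
p = p_inf @ T_pot
T10 = np.linalg.matrix_power(T_avg, 10)
for t in range(0, 51, 10):
    print(f"t = {t:3d} events   signal = {p @ w - baseline:+.4f}")
    p = p @ T10
```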

Speaker Details

Matthew Wilson received his bachelor’s degree in Electrical Engineering from Rensselaer Polytechnic Institute in 1983, his master’s degree in Electrical Engineering from the University of Wisconsin, Madison in 1986, and his Ph.D. in Computation and Neural Systems from the California Institute of Technology in 1990. In 1991, at the University of Arizona, Tucson, he began studying the formation of memory using large-scale multi-electrode recordings of neuronal ensembles in the hippocampus of freely behaving rats. He continues to study the mechanisms of memory formation in rodents at the Massachusetts Institute of Technology as a member of the faculty of the Departments of Brain and Cognitive Sciences and Biology, the Picower Center for Learning and Memory, and the RIKEN-MIT Neuroscience Research Center.

Date:
Speakers:
Matthew Wilson, Cristina Savin, and Surya Ganguli
Affiliation:
MIT, University of Cambridge, Stanford University