Colloquium - Mayank R. Mehta (UCLA) Neurophysics of Space, Time and Memory

September 16, 2014
4:00PM - 5:00PM
1080 Physics Research Building - Smith Seminar Room - reception at 3:45pm in the Atrium

All animals move in space as a function of time. Space and time are abstract concepts because they can neither be directly felt nor readily controlled. How does the brain, or an ensemble of neurons, create a perception of space and time? Individual neurons in a few key brain regions are selectively active as a function of the subject's position in space and past experience. The biophysical mechanisms governing such neural maps of space and time have remained elusive. Hence, we have developed a noninvasive, immersive, and multisensory virtual reality system in which precisely controlled stimuli determine the surrounding virtual space. Using this apparatus, we have measured the activity of thousands of individual neurons, developed analysis techniques to decipher the emergent neural ensemble dynamics, and generated computational models that can explain these neural dynamics as a function of virtual space. This integrative approach has provided a neurophysical understanding of how ensembles of neurons generate basic concepts such as position, relative distance, speed, continuity, and memory. Surprisingly, internally generated neural oscillations are crucial for these computations.