Physics Colloquium - Vijay Balasubramanian (University of Pennsylvania) "How smart can you get?: Computational efficiency in neural circuits"

November 8, 2011
4:00 pm - 5:00 pm
Smith Seminar Room

How smart can you get?: Computational efficiency in neural circuits

Vijay Balasubramanian

University of Pennsylvania

The human brain weighs about 1.5 kilograms and is made up of 100 billion neurons intricately connected via a network so dense that every cubic millimeter of the brain contains 4 kilometers of wire. The electrical activity of this incredibly complex network of neurons makes up human thought and allows us to move, perceive, talk, plan, love, and do physics, all with a subtlety and precision that escapes the most powerful computers. And to do all this, the brain uses only as much power as a refrigerator light bulb! I will discuss the strategies employed by neural circuits (e.g., specialization of function) to efficiently organize the use of power, space, and other resources. Using the example of the circuits in the retina, I will argue that efficiency of computation can give a "theory of design" explaining key aspects of the observed architecture. Finally, I will argue that there is a law of diminishing returns on these strategies. This law affects our ability to get "smarter" by evolving to have bigger brains (more neurons), using more energy (more active neurons), having more cleverly organized circuits, or even attaching plug-in external modules that could help the brain do difficult things like multiplication.

Dr. Balasubramanian's Web Site

Faculty host: Samir Mathur