A disappointing symposium: McConnell Brain Imaging Centre 30th Anniversary

Yesterday I attended an all-day symposium at the Montreal Neurological Institute, to celebrate the 30th anniversary of the founding of the McConnell Brain Imaging Centre. I went to hear talks by David Van Essen, Principal Investigator, The Human Connectome Project; Henry Markram, Director, The Blue Brain Project, and Coordinator, The Human Brain Project; and Robert J. Zatorre, founding co-director of the International Laboratory for Brain, Music and Sound Research (BRAMS).

I wish I could say that I came away from the symposium excited by visionary ideas, by promising new technologies to improve people's health or treat illness, or by significant advances in understanding the brain. Instead, what I learned was that engineers have been busy improving brain imaging technology, which generates ever greater volumes of data, while physicians and computer scientists struggle to make sense of it all, now measured in terabytes and petabytes. At least the neurosurgeons have higher-resolution scans to help guide their operations.

What is missing is a paradigm for reducing all the data down to the essentials, by getting rid of the noise. Music, the topic of Robert Zatorre’s talk, suggests how we need to approach the big data problem. A millennium or more ago, people worked out a code for representing music on paper; this code, musical notation, cannot specify all possible sound frequencies. It focuses on just those frequencies which are found in music; all others are noise and have no representation. Can you imagine if we tried to represent a piece of music by the physics governing each musical instrument and parameters such as string tension, air pressure and temperature, at every moment in time? Too much data!

Music notation quantizes information such as pitch. Instead of infinitely many frequencies, which would have to be represented by real numbers (floating-point values in computers), musical notation quantizes, or discretizes, pitch into just 12 note classes (at least for Western music). Each note class fits in a mere 4 bits.
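The quantization step can be sketched in a few lines of Python. This is purely illustrative: the function name and the A4 = 440 Hz equal-temperament reference are my own choices, not anything from the talks.

```python
import math

A4 = 440.0  # reference pitch in Hz (equal temperament)
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def quantize_pitch(freq_hz: float) -> str:
    """Snap an arbitrary frequency to one of the 12 semitone classes."""
    # Distance from A4 in semitones; fractional values are the "noise"
    semitones = 12 * math.log2(freq_hz / A4)
    midi = round(semitones) + 69          # MIDI convention: A4 = note 69
    return NOTE_NAMES[midi % 12]          # 12 classes -> fits in 4 bits

print(quantize_pitch(440.0))   # A
print(quantize_pitch(261.63))  # C (middle C)
print(quantize_pitch(445.0))   # a slightly sharp A still quantizes to A
```

Note how the rounding is exactly where the noise rejection happens: a singer a few cents sharp and a perfectly tuned piano produce the same 4-bit symbol.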

The approach to modelling the brain being pursued by Henry Markram and the Blue Brain Project is likewise encumbered by massive data. The basic physiology of each neuron is simulated and then replicated millions, and eventually billions, of times, an approach that demands enormous computing power. In contrast, real brains quantize at every level: neurons are discrete elements; synapses are discrete elements; even neurotransmitters and receptors are discrete molecules. Models based on discrete codes effectively get rid of noise and turn data into information.
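A back-of-the-envelope comparison makes the point concrete. All the numbers below are my own illustrative assumptions (sample rate, firing rate, byte widths), not figures from the symposium: they simply contrast storing a neuron's continuously sampled voltage against storing its discrete spike times.

```python
# Hypothetical cost of one second of one neuron's activity, two ways.

SAMPLE_RATE_HZ = 10_000          # continuous voltage sampled at 10 kHz (assumed)
BYTES_PER_SAMPLE = 8             # one double-precision float per sample
continuous_bytes = SAMPLE_RATE_HZ * BYTES_PER_SAMPLE   # 80,000 bytes

FIRING_RATE_HZ = 20              # assume ~20 spikes per second
BYTES_PER_SPIKE_TIME = 4         # one 32-bit timestamp per spike
discrete_bytes = FIRING_RATE_HZ * BYTES_PER_SPIKE_TIME  # 80 bytes

print(continuous_bytes // discrete_bytes)  # 1000-fold reduction
```

Under these assumptions the discrete code is a thousand times smaller, and, like musical notation, it discards precisely the fluctuations that carry no message.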

We need a paradigm shift! Unfortunately, we cannot call it quantum information as that term has already been appropriated. Suggestions for a name, anyone?
