
Saturday, March 12, 2016

Could Your Brain Store All the Information on the Web?

The title sounds outrageous. But supporting data comes from research at the prestigious Salk Institute. Other researchers had made enormous estimates of brain storage capacity, but this new estimate is 10 times greater: on the order of petabytes, roughly as much as the entire World Wide Web.

How does anybody come up with such estimates? What is the basic premise? First, memory is quantified as the number of bits of information that can be stored and recovered. In the case of the brain, the question is how much information can be stored in a synapse, the communicating junction between neurons. Size and operational strength of a synapse are the key variables: strength can be measured in bits, and strength correlates with the number and size of synapses. Under high magnification, synapses look like a bunch of beads on a string. In the newborn brain there are relatively few “beads,” but these increase in number and size as the baby grows and learns things. Sadly, in old age, many of these beads disappear unless the brain is kept very active.

In the Salk study of rat brains, electron micrographs of the memory-forming structure, the hippocampus, allowed 3-D reconstruction and measurement of the diameters of dendritic spines, the structures on neurons that receive synaptic contacts. Synaptic strength correlates with storage capacity, and strength is reflected in the size of the synapse, which appears as a round bead attached by a stalk, or neck, to the supporting neuronal membrane. The size of the bead varies with synaptic strength because stronger synapses contain more of the molecular machinery that mediates synaptic communication. Thus, spine bead size is a proxy for synaptic strength and storage capacity.

The investigators found that a small cube of brain tissue contained 26 distinguishable bead sizes, each associated with a distinct synaptic strength. The authors state that this equates to approximately 4.7 bits of information at each synapse (the base-2 logarithm of 26 is about 4.7). Multiply 4.7 bits by the trillions of synapses in the brain and you get phenomenal storage capacity.
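For readers who like to see the arithmetic, here is a minimal back-of-the-envelope sketch of how such a headline number can be reached. The synapse count and the unit conversions are my own rough assumptions (a commonly cited order of magnitude is 10^15 synapses in a human brain), not figures taken from the paper itself.

```python
import math

# Rough sketch of the scaling argument, under stated assumptions.
bead_sizes = 26
bits_per_synapse = math.log2(bead_sizes)   # ~4.7 bits per synapse
synapse_count = 1e15                       # assumed order of magnitude, not from the paper

total_bits = bits_per_synapse * synapse_count
total_petabytes = total_bits / 8 / 1e15    # bits -> bytes -> petabytes

print(f"bits per synapse: {bits_per_synapse:.2f}")   # 4.70
print(f"total capacity:  ~{total_petabytes:.2f} PB") # ~0.59 PB, i.e., on the order of a petabyte
```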

While I marvel at the elegant complexity of these research methods, I think the interpretations are a bit simplistic. There are some caveats that the authors overlooked. For one, an assumption is made that the number of storage bits equals the base-2 logarithm of the number of distinguishable bead sizes. The “bit” is a unit of information in digital communications. Information theory holds that one bit is the information conveyed by the outcome of a binary random variable that is equally likely to be 0 or 1, or more loosely, present or absent. One has to take some liberties to apply this concept to memory storage in the brain, because the brain is not a digital computer. It is an analog biological computer.
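To make the point concrete, here is a small illustration of the information-theoretic assumption, not the authors' actual analysis: log2(26) ≈ 4.7 bits is the theoretical maximum, reached only when all 26 sizes are equally likely and perfectly distinguishable; any skew or overlap in the real distribution of sizes would yield fewer bits per synapse. The size distributions below are hypothetical.

```python
import math

def shannon_bits(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Best case: all 26 spine sizes equally likely and perfectly distinguishable.
uniform = [1 / 26] * 26
print(round(shannon_bits(uniform), 2))   # 4.7

# A skewed (hypothetical) size distribution carries fewer bits per synapse.
skewed = [0.5] + [0.5 / 25] * 25
print(round(shannon_bits(skewed), 2))    # 3.32
```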

Then there is the problem that the hippocampus deals only with forming declarative and episodic memories, not procedural memories such as touch typing or playing the piano. Thus the storage capacity, whatever it is, has not been estimated for procedural memories. Moreover, declarative and episodic memories are not stored in the hippocampus; they are stored in a distributed fashion throughout the brain. Since synaptic measurements were made only on hippocampal tissue, there are no data for the rest of the brain.

But there is a larger issue. How does one know how many bits it takes to represent different memories? Not all memories are the same, and surely they don't all require the same number of storage bits.

Actually, the exact number of bits of information that brains can store is rather irrelevant. By any measure, common experience teaches that nobody utilizes all their memory capacity. Moreover, the amount of information a given person stores varies profoundly depending on such variables as motivation to remember things, use of mnemonics, and level of education. The question that needs answering, given that we have vast amounts of unused storage capacity, is "Why don't we remember more than we do?"  Books like my Memory Power 101 provide some practical answers.

Source:


Bartol, Thomas M., et al. (2015). Nanoconnectomic upper bound on the variability of synaptic plasticity. eLife, 4:e10778. November 30. http://dx.doi.org/10.7554/eLife.10778