New research shows the brain’s memory capacity is ten times greater than previous estimates. That means it’s in the petabyte range — which puts it close to World Wide Web territory.
The human brain is often compared to a computer. One aspect in particular that lends itself well to these sorts of comparisons is memory. When computer scientists talk about memory, they refer to RAM (random access memory) and hard drive storage. When referring to the human brain, neuroscientists speak of short-term memory, which is like RAM, and long-term memory, which is akin to a hard drive. Consequently, scientists find it helpful to analogise our brain's storage capacity to a computer's, which explains why they measure it in bits and bytes.
Unfortunately, there's a lack of consensus on how much information our brains are capable of storing. Estimates vary from one terabyte to 100 terabytes to 2500 terabytes (a terabyte being 1000 gigabytes). But as a new study from the Salk Institute shows, these estimates appear to be an order of magnitude too low. By creating a computational reconstruction of a segment of a rat's brain, a team led by the Salk Institute's Terry Sejnowski has shown that the human brain's memory capacity is actually in the petabyte range. The details of their work can now be found in the science journal eLife.
“This is a real bombshell in the field of neuroscience,” said Sejnowski in a press release. “We discovered the key to unlocking the design principle for how hippocampal neurons function with low energy but high computation power. Our new measurements of the brain’s memory capacity increase conservative estimates by a factor of 10 to at least a petabyte, in the same ballpark as the World Wide Web.”
To be fair, it’s in the same general area as the Web, but it’s not quite in the same ballpark as the Web. The Salk researchers appear to be overstating it a bit. If we consider the Big Four — Google, Amazon, Microsoft and Facebook — their servers alone store at least 1200 petabytes between them. That excludes the rest of the Web, including storage providers like Dropbox, Barracuda and SugarSync.
But one petabyte is still a massive amount of data. Expressed numerically, it's 2⁵⁰ bytes, or roughly a million gigabytes. A good point of comparison is the total amount of data amassed at the US Library of Congress, which is about 235 terabytes; one petabyte is about four times that. Put yet another way, one petabyte is enough to store the DNA of the entire population of the United States twice over. So our brains may not have the equivalent storage capacity of the entire Web, but they still amount to a huge data reservoir.
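The arithmetic behind these comparisons is easy to check. A minimal sketch, using the figures quoted above (235 terabytes for the Library of Congress, and the binary convention where a petabyte is 2⁵⁰ bytes):

```python
# Back-of-the-envelope check of the storage comparisons above.
PETABYTE = 2 ** 50   # 1 petabyte in bytes (binary convention)
TERABYTE = 2 ** 40   # 1 terabyte in bytes

# Figure quoted above for the Library of Congress's data holdings.
library_of_congress = 235 * TERABYTE

# How many Library-of-Congress-sized collections fit in one petabyte?
ratio = PETABYTE / library_of_congress
print(f"1 petabyte is about {ratio:.1f}x the Library of Congress")
```

The ratio works out to roughly 4.4, consistent with the "about four times" figure above (using decimal units, where a petabyte is 1000 terabytes, it comes to about 4.3).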
The team created a computational 3D reconstruction of rat hippocampal tissue (the memory centre of the brain), which revealed something quite unexpected. Certain neurons appeared to be sending duplicate messages to receiving neurons. Intrigued, the researchers decided to measure and compare the sizes of two similar synapses, which they hoped would refine their understanding of synaptic size. This contrasts with how neuroscientists typically classify synapses, loosely describing them as small, medium, or large. That's problematic, given that the memory capacity of neurons depends on the size of their synapses.
The researchers discovered that synapses of all sizes can vary in increments as small as eight per cent, which means there could be as many as 26 distinct categories of synapse size. This fine-grained "synaptic plasticity" means there are 10 times more discrete sizes of synapses than previously thought. In computational terms, 26 distinguishable sizes equates to about 4.7 bits of information per synapse (since log₂ 26 ≈ 4.7). Prior to this study, neuroscientists thought the brain was capable of storing just one to two bits per synapse for short- and long-term memory in the hippocampus.
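The conversion from size categories to bits follows from basic information theory: distinguishing among N equally likely categories requires log₂ N bits. A quick sketch of that calculation, using the study's figure of 26 categories:

```python
import math

# Number of distinguishable synapse size categories reported in the study.
size_categories = 26

# Information needed to pick out one of N equally likely categories
# is log2(N) bits.
bits_per_synapse = math.log2(size_categories)

print(f"{bits_per_synapse:.1f} bits per synapse")  # ~4.7 bits
```

For comparison, the earlier one-to-two-bit estimates correspond to synapses having only two to four distinguishable states.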
“This is roughly an order of magnitude of precision more than anyone has ever imagined,” said Sejnowski.
This discovery also helps to explain the brain's surprising energy efficiency, which could eventually inspire ultraprecise, super-efficient computers, including those that utilise deep learning and artificial neural networks. So in the same way that computers are helping us to understand the human brain, these sorts of neural insights are helping us build more efficient and powerful computers.
Top image: Salk Institute