Human brains can hold 10 times more information than thought, equivalent to the entire internet

Scientists have also discovered how the brain manages to be so efficient, potentially allowing those same discoveries to be used in computers

Andrew Griffin
Friday 22 January 2016 12:02 EST


The human brain may be able to hold 10 times more information than previously thought, roughly as much as the entire internet.

The findings also help explain why the brain is so efficient: it runs on about 20 watts of continuous power, about as much as a very dim light bulb.

The findings could help scientists build faster, more efficient computers, as well as teach them far more about how the human brain works.

Scientists used new techniques to study the computing power of the brain and found that it is far greater than previously thought.

“We discovered the key to unlocking the design principle for how hippocampal neurons function with low energy but high computation power,” said Terry Sejnowski, Salk professor and co-senior author of the paper, which was published in eLife. “Our new measurements of the brain's memory capacity increase conservative estimates by a factor of 10 to at least a petabyte, in the same ballpark as the World Wide Web.”

The electrical and chemical activity that flows through the brain passes through synapses. The researchers had to reconstruct those synapses so they could study how parts of the brain connect to each other and how large the synapses are.

By plugging the new synapse sizes into an algorithm, the researchers found that there are far more categories of synapse than had been thought, meaning that far more information can be stored.

“Our data suggests there are 10 times more discrete sizes of synapses than previously thought,” said Tom Bartol, a Salk staff scientist who worked on the study. “In computer terms, 26 sizes of synapses correspond to about 4.7 ‘bits’ of information. Previously, it was thought that the brain was capable of just one to two bits for short and long memory storage in the hippocampus.”
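The “4.7 bits” figure quoted above follows from basic information theory: a component that can take one of n distinguishable states can encode log2(n) bits. A minimal sketch of that arithmetic in Python:

```python
import math

# 26 discrete synapse sizes reported in the study
n_sizes = 26

# Number of distinguishable states -> bits of information
bits = math.log2(n_sizes)

print(f"{n_sizes} synapse sizes -> about {bits:.1f} bits per synapse")
# 26 sizes give roughly 4.7 bits, versus 1-2 bits under the older estimates
```

The same formula explains the earlier estimate: two to four distinguishable sizes would give only log2(2) = 1 to log2(4) = 2 bits.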

Synapses shift among those various sizes, adjusting themselves in response to the signals they receive, the scientists said.

“The implications of what we found are far-reaching,” added Sejnowski. “Hidden under the apparent chaos and messiness of the brain is an underlying precision to the size and shapes of synapses that was hidden from us.”
