Neuroscience’s Big Data Problem

Tim Verstynen objects to Obama’s BRAIN Initiative, which aims to map all neuronal activity in the human brain, on the grounds that neuroscientists have no trouble producing data but lack the unifying theories needed to make sense of it:

The Human Genome Project was useful because the core theory of how sequences of DNA base pairs encode proteins and other molecules was already well understood. As soon as they had their DNA road maps, biologists knew how to drive their cars around to get to new places.

Unfortunately, in neuroscience we’re still learning how to drive our cars. As the neurologist Robert Knight is fond of saying, twenty years ago we knew 1% of how the brain works, and now we’ve doubled it.

We don’t have the core theories that will allow us to make sense of the data we will get from large research initiatives like BRAIN and the Human Connectome Project. So we can get our detailed maps of the brain all we want, but they won’t make any more sense to us than the brain does now…. If the President and the federal funding agencies really wanted to find cost-effective ways of understanding the human brain, they would start by providing more infrastructure for building better theories of how the brain works first.

Previous Dish on neuroscience-in-infancy here.