February 26, 2013
The Obama administration has revealed plans for an ambitious decade-long brain mapping project, similar in scope to the Human Genome Project.
Remarks made by the President in his 2013 State of the Union speech were soon confirmed by this Tweet from National Institutes of Health Director Francis S. Collins: "Obama mentions the #NIH Brain Activity Map in #SOTU."
[Image source: Human Connectome Project]
A more formal acknowledgement came from National Institute of Neurological Disorders and Stroke Director Story C. Landis. Quoted in the New York Times article that broke the story, Landis also connected Obama's remarks to the Brain Activity Map (BAM) project.
The genesis of the project can be traced to a scientific article published last June in Neuron.
"The function of neural circuits is an emergent property that arises from the coordinated activity of large numbers of neurons," writes the six-author team. "To capture this, we propose launching a large-scale, international public effort, the Brain Activity Map Project, aimed at reconstructing the full record of neural activity across complete neural circuits. This technological challenge could prove to be an invaluable step toward understanding fundamental and pathological brain processes."
The journal article outlines several ways the mapping could be approached and points to potential treatments for schizophrenia and autism.
Parties involved in the project's planning estimate it will cost at least $300 million a year, or $3 billion over the 10-year span. By comparison, the Human Genome Project totaled $3.8 billion. That initiative, which sought a complete mapping of the human genome, finished ahead of schedule in April 2003 and, according to a federal impact study, had returned $800 billion to the economy by 2010.
A lot is being made of the similarity of these two projects and the potential for big science spending to invigorate the economy.
"Every dollar we invested to map the human genome returned $140 to our economy – every dollar," Obama said. "Today our scientists are mapping the human brain to unlock the answers to Alzheimer's. They're developing drugs to regenerate damaged organs, devising new materials to make batteries 10 times more powerful. Now is not the time to gut these job-creating investments in science and innovation. Now is the time to reach a level of research and development not seen since the space race. We need to make those investments."
But how alike are these two projects really? The scientific consensus is that mapping and understanding the brain is a far more complex endeavor than a full accounting of human DNA.
Dr. Ralph J. Greenspan, one of the authors of the Neuron paper, highlighted the distinction:
"It's different in that the nature of the question is a much more intricate question. It was very easy to define what the genome project's goal was. In this case, we have a more difficult and fascinating question of what are brainwide activity patterns and ultimately how do they make things happen?"
According to NYT reporting, BAM is a joint project of the National Institutes of Health, the Defense Advanced Research Projects Agency and the National Science Foundation and will be organized by the Office of Science and Technology Policy. The Howard Hughes Medical Institute in Chevy Chase, Md., and the Allen Institute for Brain Science in Seattle were listed as private partners.
The US brain mapping project comes on the heels of the Swiss brain modeling project, unveiled last month. The European Commission just awarded half a billion euros to the Human Brain Project – an extension of Henry Markram's Blue Brain project that aims "to simulate a complete human brain in a supercomputer."
The BAM project, on the other hand, is working to create a functional map of the active human brain. The six contributors to the Neuron article write that "understanding how the brain works is arguably one of the greatest scientific challenges of our time." Despite the inevitable difficulties, it looks like the research community is eager to unlock the mysteries of this frontier.
The journal article concludes with this call to action:
To succeed, the BAM Project needs two critical components: strong leadership from funding agencies and scientific administrators, and the recruitment of a large coalition of interdisciplinary scientists. We believe that neuroscience is ready for a large-scale functional mapping of the entire brain circuitry, and that such mapping will directly address the emergent level of function, shining much-needed light into the "impenetrable jungles" of the brain.
Further details are expected when Obama unveils his budget next month.