July 11, 2011
Can the human brain devise a system capable of understanding itself? That's been something brain simulation researchers have been working toward for nearly a decade. With recent advances in supercomputing capabilities and modeling techniques, the question may soon be answered.
Understanding the fundamental workings of the brain would revolutionize neuroscience. It is estimated that about a quarter of the population in the US and Europe has some sort of brain disorder, spanning everything from anxiety attacks and mild depression to Alzheimer's and full-blown neuroses. Brain-related health care costs currently amount to over a trillion dollars per year in the West, along with another trillion in lost productivity. To put it mildly, it is a problem that needs fixing.
The brain simulation SpiNNaker project, which recently received its first shipment of custom-built supercomputer chips, is the UK's contribution to the effort. Far better known (and better funded) is the Blue Brain Project, headed by neuroscientist-turned-informatics-specialist Henry Markram and run out of the École Polytechnique Fédérale de Lausanne (EPFL).
Markram is now advocating for an even more ambitious endeavor. Known as the Human Brain Project, it is a 10-year effort estimated to cost a billion euros. The goal is to build upon the knowledge accumulated under the Blue Brain work -- software models, tools, supercomputing expertise -- and create a multi-level simulation of the entire human brain. That will require exascale-level hardware and the software to exploit it.
At the International Supercomputing Conference in Hamburg last month, Markram described his current work and the future of virtual neuroscience. In his keynote address (available online here), he noted that the pharmaceutical industry, which funds 95 percent of the research in this area, expends its resources on just a handful of the 560 clinically classified brain disorders -- the diseases that offer the most attractive payoff from a drugmaker's perspective. The other 5 percent of the funding comes from academia, which is tasked with researching and developing possible treatments for the vast majority of neurological conditions.
In both cases though, the approach has been reductionist: to focus on the specific neurological structures and mechanisms that underlie a disorder. From Markram's perspective, that strategy has led to only piecemeal progress. "The solution," he says, "is to integrate it all... using simulations, into a unified model."
We asked Markram to talk about the work he's been doing on the Blue Brain Project and give us his vision of how human brain simulation will advance in the coming years.
HPCwire: Could you give us an overview of your work and what you intend to accomplish?
Henry Markram: Our mission has been to establish a radically new approach to understanding the brain. The best way to describe it is ICT-powered biology. It is a highly integrative approach of building the brain using biological rules and data-intensive computing. The brain is built on biological rules, and it stands to reason that you can use the same rules to build a model that generates many of the brain's functions. The advantage of this approach is that when a function does emerge, we can actually trace a meaningful biological basis for that function.
Diseases are also likely to result wherever a rule can break, and so searching for vulnerable rules provides a new strategy for predicting the causes of diseases. Each step involves searching for patterns of organization in biological data (informatics), deriving rules (algorithms), and using the rules to build a new generation of models (modeling) for simulation testing (simulations). Simulating the new generation of models reveals the strengths and weaknesses of the rules, which we can use to refine the rules and also to find new rules that we can use to build even more accurate models.
It is a rule discovery process -- a telescope into the brain whose resolution depends not only on the hardware and software, but also on the rules. As we build the "telescope" we can look deeper and wider into the brain, and that helps us build a better telescope: better software, better hardware, and better rules. We need constant innovations in supercomputing and informatics technologies -- the hardware and the software. These allow us to build larger and larger brain models with more and more detail.
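The loop he describes -- search data for patterns, derive rules, build models, and test them in simulation -- can be illustrated with a deliberately tiny sketch. The example below is not Blue Brain code: the "rule" being discovered (a single current-to-firing-rate gain) is a hypothetical stand-in for the far richer biological rules of the project, and every value in it is made up.

```python
import random

# Toy illustration of the data -> rules -> model -> simulation loop described
# above. The "rule" here (a current-to-rate gain fitted from noisy
# observations) is a hypothetical stand-in, not an actual Blue Brain rule.

random.seed(1)

def record_experiment(current_nA):
    """Pretend wet-lab data: firing rate grows with input current, plus noise."""
    hidden_gain = 12.0                                  # the 'biology' to discover
    return hidden_gain * current_nA + random.gauss(0.0, 2.0)

def derive_rule(observations):
    """Informatics step: fit the gain by least squares through the origin."""
    num = sum(c * r for c, r in observations)
    den = sum(c * c for c, _ in observations)
    return num / den

def simulate(gain, current_nA):
    """Modeling/simulation step: predict the rate from the derived rule."""
    return gain * current_nA

data, gain = [], 1.0
for iteration in range(5):
    currents = [random.uniform(0.1, 2.0) for _ in range(20)]
    data += [(c, record_experiment(c)) for c in currents]        # gather more data
    gain = derive_rule(data)                                      # refine the rule
    error = sum(abs(simulate(gain, c) - r) for c, r in data) / len(data)
    print(f"iteration {iteration}: gain={gain:.2f}, mean error={error:.2f}")
```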
In 2008 we could build and simulate 10,000 neurons and 10 million synapses using a Blue Gene/L supercomputer. Today we can build models with 1 million neurons and 1 billion synapses using a Blue Gene/P supercomputer. These are not point neurons as in artificial neural networks or neuromorphic computing. They are the most detailed and accurate models of real neurons ever built. We also now build them automatically, and we are learning how to synthesize them using basic rules.
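For context, a "point neuron" of the kind Markram contrasts his models with collapses the whole cell into a single membrane-potential equation. The sketch below is a minimal leaky integrate-and-fire point neuron in Python with purely illustrative parameter values; the detailed Blue Brain neurons instead resolve dendritic morphology and ion-channel kinetics across many compartments.

```python
# Minimal leaky integrate-and-fire "point neuron" -- the kind of simplified
# model the detailed, morphologically accurate Blue Brain neurons are NOT.
# All parameter values below are illustrative, not project settings.

def simulate_lif(i_input_nA, dt_ms=0.1, t_total_ms=200.0):
    v_rest, v_reset, v_thresh = -65.0, -70.0, -50.0    # mV
    tau_m, r_m = 20.0, 10.0                            # membrane time constant (ms), resistance (MOhm)
    v = v_rest
    spike_times_ms = []
    for step in range(int(t_total_ms / dt_ms)):
        # One equation for the whole cell: dV/dt = (-(V - V_rest) + R*I) / tau
        dv = (-(v - v_rest) + r_m * i_input_nA) / tau_m
        v += dv * dt_ms
        if v >= v_thresh:                              # threshold crossing => spike, then reset
            spike_times_ms.append(step * dt_ms)
            v = v_reset
    return spike_times_ms

if __name__ == "__main__":
    print(simulate_lif(i_input_nA=2.0))                # prints spike times in ms
```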
There is a long way to go, but we have built the ICT infrastructure that now allows us to move faster. It is the first version of a platform for ICT-powered biology. The models only get better, and so it is a one-way track to understanding the rules that build the brain. In the process, it also provides a roadmap for the supercomputing of the future. Understanding the rules also reveals computational principles which we aim to exploit in artificial neural networks and neuromorphic computing -- exporting simplified circuit designs that deliver desired functions.
We have put neuroscience on the IT highway and we will now be able to move exponentially faster. We have found dozens of new rules in the process, demonstrating that this form of ICT-powered biology is a powerful new way of systematically integrating what we know about the brain and of using ICT to chart new territory of the brain that would take experimental biology many decades to reach.
HPCwire: Who is funding the work and about how much money is involved?
Markram: We had to buy IBM supercomputers, and these were purchased not only for the Blue Brain Project but also for many other projects; they have been funded collectively by the universities in the area. For operations, the first prototype phase up until now has not been very expensive -- similar to a large R01 grant that most US scientists run on -- an average of a million Swiss francs per year. To continue will cost much more, and that is why we are proposing the Human Brain Project to the European Union. With a budget of around €100M per year, we can pick up speed.
HPCwire: What kind of computers are currently at your disposal? What are the current limitations of these systems in regard to the simulations?
Markram: We now have a Blue Gene/P with over 16,000 cores. Of course we are at the computing limit; we constantly need more computing power. This is perhaps the most extreme challenge for supercomputing. We will need an exascale system to simulate the human brain at a cellular level, with the capability of performing molecular-resolution simulations only for zoomed-in areas of activity. We will need to push beyond exascale to go further than that.
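To get a feel for why exascale comes up, a rough back-of-envelope estimate can be made from commonly cited anatomical figures (on the order of 10^11 neurons and 10^15 synapses in a human brain). The per-synapse operation count and time step in the sketch below are assumptions chosen purely for illustration, not figures from the Blue Brain or Human Brain Project.

```python
# Back-of-envelope estimate of the compute needed for a cellular-level,
# real-time human brain simulation. The per-synapse cost and time step are
# illustrative assumptions, not Blue Brain / Human Brain Project figures.

NEURONS = 8.6e10                 # commonly cited count for the human brain
SYNAPSES_PER_NEURON = 1e4        # order-of-magnitude estimate
FLOP_PER_SYNAPSE_UPDATE = 100    # assumed cost per synapse per time step
TIMESTEPS_PER_SECOND = 1e4       # 0.1 ms integration step, real-time target

synapses = NEURONS * SYNAPSES_PER_NEURON
flops_needed = synapses * FLOP_PER_SYNAPSE_UPDATE * TIMESTEPS_PER_SECOND

print(f"Synapses: {synapses:.1e}")
print(f"Sustained FLOP/s for real time: {flops_needed:.1e}")  # ~1e21, beyond exascale (1e18)
```

Under these assumptions the sustained requirement lands around 10^21 operations per second, which is why anything finer than cellular resolution pushes past exascale.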
HPCwire: What does the output of the simulation look like?
Markram: Just like experiments on real brain tissue. In other words, we can do electrical recordings, we can record the transmission between neurons or networks of neurons, we can image the activity of all the neurons, we can record the electrical fields generated by one neuron or by all the neurons together, and so on. But we can also perform experiments that are not yet possible in the experimental lab and will not be possible for a very long time. We can record or map any parameter that we used to build the model. We can also map searched-for patterns of activity, and so on. It is a very powerful "lab" designed for biology-style experimentation -- a virtual laboratory.
HPCwire: About how many lines of code are we talking about?
Markram: It is not one piece of code; it is a huge ecosystem of code that deals with the informatics, brain building, simulation, visualization, analysis, virtual lab experiments, real-time uplinks, and so on. We have not counted all the lines of code, but it is large and growing.
HPCwire: What have you learned from the work so far?
Markram: Most importantly, we established the infrastructure to do this. It is unique in the world. We have not just solved a computer science problem; we have also achieved the ultimate integration of computer science and neuroscience. We had to solve dozens of problems to get to a workable ecosystem where we can build models according to biological rules. In terms of understanding the brain, we discovered many rules that would not have been possible to find experimentally.
We found general rules that now allow automated building of very accurate neurons; we found general rules that help us connect any neuron to another -- the so-called "connectome"; we found general rules for the robustness and invariance of neural circuits, that is, we know what makes neural circuits resistant to damage and what makes neural circuits behave the same even when their elements are different; we found general rules for the emergent properties that appear as we add columns; and many more. We also found new computing strategies that will add a new dimension to my previous discovery of liquid computing with Wolfgang Maass.
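As a concrete flavor of what a "connection rule" can look like, the sketch below implements a generic distance-dependent wiring rule in Python: the probability that two neurons connect decays exponentially with the distance between them. The probability scale and length constant are made-up illustrative values, not rules derived by the Blue Brain Project.

```python
import math
import random

# A generic distance-dependent connection rule, purely for illustration.
# The probability scale and length constant are invented values, not
# rules derived by the Blue Brain Project.

def connect(positions_um, p_max=0.2, length_constant_um=150.0, seed=0):
    """Return (pre, post) index pairs wired under a distance-dependent rule."""
    rng = random.Random(seed)
    edges = []
    for i, a in enumerate(positions_um):
        for j, b in enumerate(positions_um):
            if i == j:
                continue
            d = math.dist(a, b)                              # distance in micrometers
            p = p_max * math.exp(-d / length_constant_um)    # probability decays with distance
            if rng.random() < p:
                edges.append((i, j))
    return edges

if __name__ == "__main__":
    random.seed(42)
    # 50 cells scattered in a 500-micrometer cube
    cells = [tuple(random.uniform(0.0, 500.0) for _ in range(3)) for _ in range(50)]
    print(len(connect(cells)), "connections")
```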
HPCwire: What is the next step for your work?
Markram: We are expanding the capability and capacity of the ICT infrastructure to allow the building of a whole brain (at the rodent level) and of neuron models with molecular-level detail. To do this we will need to take the next step in computing power with a petascale supercomputer.
HPCwire: At what point do you think you'll be able to simulate a complete human brain?
Markram: I always say 10 years because I believe it is technically possible in 10 years. But the clock can only start ticking once we get the proper funding to go beyond this initial stage. It cannot be done out of thin air. If we get the FET Flagship grant in 2013, then by 2023-2024 we will be capable of assembling all we know to build a human brain model. If we don't get the funding it will take decades longer.
HPCwire: Will such a simulation exhibit the same properties as the organic version? Do you think features like creativity and emotions could emerge? How about consciousness?
Markram: This is a research tool; it is not a toy to see what will happen if one builds a brain. It is not a magical model that suddenly explains to you all the secrets of the brain. It is a model that takes all our knowledge just to build it in the first place. We learn at each step and will probably understand most of the key principles well before we build the first model of the human brain. It is a research tool for collaborative in silico experiments and hypothesis testing.
If the model is built on biological rules and we can implement these rules accurately enough, many functions should emerge without us having to explicitly program them in. If they do not, then obviously we missed something or could not capture the detail with sufficient accuracy. Such a "failure" is also a great success, and just as important as when a function does emerge, since it means that all that has gone into the model is just not enough.
Everyone argues about how much detail is needed for complex functions. Well this way, we will not have to argue. Either way we learn. You can't lose with this approach.