In a recent blog entry, IBM’s Thorsten Mühge discusses the emerging field of “brain-inspired computing” in the context of its potential to revolutionize the computer industry. The research area is taking off as an alternative to the current “von Neumann” architecture, which extends back to 1945. Writes Mühge: “This design is extremely powerful for analytic tasks like calculations, regression analysis, etc. State of the art computers can execute tasks with ultra-high speed of up to 5 GHz, or an equivalent of 0.2 nanoseconds per operation. In addition they can access a massive amount of information stored in their memory. However, there are major disadvantages of this current computer technology.”
The primary drawbacks are high energy consumption and a limited path forward as CMOS scaling bumps up against the fundamental barriers of physics. With the end of that exponential scaling in sight, chip designers are pursuing various strategies to sustain progress in computing. Brain-inspired computing is among the top post-silicon contenders due in large part to its innate energy efficiency.
Mühge puts forward some estimates for the brain that put its total number of operations per second at 10¹⁶, which happens to be roughly the same number of operations that current supercomputers are able to execute. But where supercomputers use a megawatt or more of energy for this task, the brain does it with 20 watts. This is the difference between the electricity consumption of a city district and that of a light bulb.
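Those figures imply an efficiency gap that is easy to quantify. The back-of-the-envelope sketch below uses only the article's round numbers (10¹⁶ operations per second for both systems; 20 watts for the brain versus roughly one megawatt for a supercomputer) and is purely illustrative:

```python
# Back-of-the-envelope comparison using the article's round figures.
OPS_PER_SECOND = 1e16          # ~10^16 ops/s for brain and supercomputer alike

BRAIN_POWER_W = 20.0           # watts consumed by the human brain
SUPERCOMPUTER_POWER_W = 1e6    # ~1 megawatt for a comparable supercomputer

# Operations per joule = throughput divided by power draw.
brain_ops_per_joule = OPS_PER_SECOND / BRAIN_POWER_W
machine_ops_per_joule = OPS_PER_SECOND / SUPERCOMPUTER_POWER_W

efficiency_gap = brain_ops_per_joule / machine_ops_per_joule
print(f"Brain:         {brain_ops_per_joule:.1e} ops/joule")
print(f"Supercomputer: {machine_ops_per_joule:.1e} ops/joule")
print(f"Gap:           {efficiency_gap:,.0f}x")  # 50,000x on these numbers
```

On these round numbers the brain comes out roughly 50,000 times more energy-efficient per operation, which is the gap brain-inspired architectures hope to close.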
Energy-efficient computing technologies like brain-inspired computing are a primary R&D focus of IBM and Forschungszentrum Jülich. The German research group joined the OpenPOWER Foundation in April 2014 to explore new supercomputing technologies. To highlight their partnership, the duo created a short video exploring their joint vision for energy-efficient computing.
“The brain is really amazing and we have much to learn about it,” says the Forschungszentrum Jülich representative in the video. “We are investigating many different aspects of the brain.”
“These are tasks in which the computational system needs to integrate a lot of information, some of which is even uncertain or incomplete,” he continues.
To which the IBM representative responds, “If we were able to understand how the brain solves these types of tasks and copy its abilities, this would mean a major step towards ground-breaking new computer capabilities.”
As for why brains are so powerful, it is explained that “human brain cells are organized in a complex network and communicate with each other by exchanging electrical signals. Recent theoretical and modeling studies suggest that the noise we observe in recordings of brain activity is an important ingredient for healthy brain function.”
Jülich scientists are exploring the origins of this noise in the human brain and investigating how to replicate the approach in computers, and possibly put it to use in future generations of smartphones.
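The idea that noise can be a useful ingredient rather than a nuisance can be illustrated with a textbook toy model: a leaky integrate-and-fire neuron driven just below its firing threshold stays silent in a deterministic simulation, but fires irregularly once random fluctuations are added. This is a generic sketch, not the Jülich researchers' model, and every constant in it is invented for illustration:

```python
import random

def simulate_lif(steps=1000, dt=1.0, noise_std=2.0, seed=42):
    """Toy leaky integrate-and-fire neuron driven just below threshold.

    With noise_std = 0 the membrane potential settles below threshold
    and the neuron never spikes; with noise it fires irregularly.
    All constants are illustrative, not biological measurements.
    """
    random.seed(seed)
    tau, v_rest, v_reset, v_thresh = 20.0, 0.0, 0.0, 15.0
    i_drive = 14.0                       # steady input just under threshold
    v, spikes = v_rest, []
    for t in range(steps):
        noise = random.gauss(0.0, noise_std)
        # Leaky integration: decay toward rest, plus drive and noise.
        v += dt * (-(v - v_rest) / tau) + dt * (i_drive / tau) + noise
        if v >= v_thresh:
            spikes.append(t)
            v = v_reset                  # reset membrane after a spike
    return spikes

silent = simulate_lif(noise_std=0.0)     # deterministic case: no spikes
noisy = simulate_lif(noise_std=2.0)      # stochastic case: irregular spiking
print(len(silent), len(noisy))
```

Without noise the membrane potential converges to a steady value just under threshold and the spike list stays empty; the same model with noise produces irregular spiking, loosely mirroring the variability seen in recordings of real brain activity.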
One of the keys to this research is powerful simulation software and IT infrastructure. To cover these needs, Jülich researchers are developing NEST, the NEural Simulation Tool, which is widely used around the world. Designed to run on a variety of systems, from laptops to clusters to the biggest supercomputers on the planet, the NEST code is maintained and steadily improved by a core team of developers and other community members.