Scientists Using Intel-Cray ‘Theta’ Supercomputer to Map Brain Function

September 14, 2017

Sept. 14, 2017 — A neuroscientist and a computational scientist walk into a synchrotron facility to study a mouse brain… Sounds like a great set-up for a comedy bit, but there is no punchline. The result is cutting-edge science that can only be accomplished in a facility as scientifically integrated as the U.S. Department of Energy’s (DOE) Argonne National Laboratory.

At first glance, or even on closer inspection, Doga Gursoy and Bobby Kasthuri would seem to sit at opposite ends of the research spectrum. Gursoy is an assistant computational scientist at Argonne’s Advanced Photon Source (APS), a DOE Office of Science User Facility; Kasthuri is an Argonne neuroscientist.

But together, they are using Argonne’s vast arsenal of innovative technologies to map the intricacies of brain function at the deepest levels, and describing them in greater detail than ever before through advanced data analysis techniques.

Gursoy and Kasthuri are among the first group of researchers to access Theta, the new 9.65 petaflops Intel-Cray supercomputer housed at the Argonne Leadership Computing Facility (ALCF), also a DOE Office of Science User Facility. Theta’s advanced and flexible software platform supports the ALCF Data Science Program (ADSP), a new initiative targeted at big data problems, like Gursoy and Kasthuri’s brain connectome project.

ADSP projects explore and improve a variety of computational methods that will enable data-driven discoveries across all scientific disciplines.

“By developing and demonstrating rapid analysis techniques, such as data mining, graph analytics and machine learning, together with workflows that will facilitate productive usage on our systems for applications, we will pave the way for more and more science communities to use supercomputers for their big data challenges in the future,” said Venkat Vishwanath, ALCF Data Sciences Group Lead.

All about the connections

This new ADSP study of connectomes maps the connections of every neuron in the brain, whether human or mouse. Determining the location of every cell in the brain and how they communicate with each other is a daunting task, as each cell makes thousands of connections. The human brain, for example, has some 100 billion neurons, creating 100 trillion connections. Even the average mouse brain has 75 million neurons.

“This ALCF award targets big data problems, and our application of brain imaging does just that,” said Gursoy, assistant computational scientist in the X-ray Science Division of Argonne’s Advanced Photon Source. “The basic goal is simple — we would like to be able to image all of the neurons in the brain — but the datasets from X-rays and electron microscopes are extremely large. They are at the tera- and petabyte scales. So we would like to use Theta to build the software and codebase infrastructure in order to analyze that data.”

This research was supported by the U.S. Department of Energy’s Office of Science. A portion of the work was also supported by Argonne’s Laboratory-Directed Research and Development (LDRD) program.

The process begins with two imaging techniques that will provide the massive sets of data for analysis by Theta. One is at the APS, where full brains can be analyzed at submicron resolution — in this case, the brain of a petite shrewmouse — through X-ray microtomography, a high-resolution 3-D imaging technique. The X-ray Science Division of the APS provides the expertise for the microtomography research. Much like a CT scanner, it produces images as micro-thin slices of a material whose structure can be meticulously scrutinized. While this resolution provides a detailed picture of blood vessels and cell bodies, the researchers aim to go still deeper.
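Conceptually, the CT-like reconstruction step works by mathematically "smearing" each measured projection back across the image along the angle at which it was acquired. The toy NumPy sketch below illustrates the idea with a minimal filtered back-projection of a uniform disk; it is an illustrative simplification, not the production software used at the APS (in practice, tools such as TomoPy handle reconstruction at scale):

```python
import numpy as np

def ramp_filter(sino):
    # Apply an ideal ramp filter to each projection row in Fourier space
    n = sino.shape[1]
    f = np.abs(np.fft.fftfreq(n))
    return np.fft.ifft(np.fft.fft(sino, axis=1) * f, axis=1).real

def fbp(sino, thetas):
    # Filtered back-projection: smear each filtered projection
    # back across the image along its acquisition angle
    n = sino.shape[1]
    filtered = ramp_filter(sino)
    recon = np.zeros((n, n))
    mid = (n - 1) / 2.0
    ys, xs = np.mgrid[0:n, 0:n] - mid
    for proj, th in zip(filtered, thetas):
        # detector coordinate of each pixel for this view angle
        t = xs * np.cos(th) + ys * np.sin(th) + mid
        idx = np.clip(np.round(t).astype(int), 0, n - 1)
        recon += proj[idx]
    return recon * np.pi / len(thetas)

# Analytic sinogram of a centered disk of radius 20 and density 1:
# its projection at every angle is 2 * sqrt(r^2 - t^2)
n, n_ang = 64, 90
thetas = np.linspace(0, np.pi, n_ang, endpoint=False)
t = np.arange(n) - (n - 1) / 2.0
proj = 2 * np.sqrt(np.clip(20.0**2 - t**2, 0, None))
sino = np.tile(proj, (n_ang, 1))
img = fbp(sino, thetas)  # reconstructed image: ~1 inside the disk, ~0 outside
```

A real micro-CT reconstruction adds corrections for beam properties, ring artifacts, and alignment, and runs over thousands of slices in parallel, but the geometric core is the same back-projection loop shown here.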

That depth of detail requires the use of an electron microscope, which transmits a short-wavelength electron beam to deliver resolution at the nanometer scale. This will allow for the capture of all the synaptic connections between individual neurons at small targeted regions guided by the X-ray microtomography.

“For years, scientists at the APS have used these techniques to deepen our understanding of a wide variety of materials, from soil samples to new materials to biological matter,” said Kamel Fezzaa from sector 32-ID at the APS. “By coordinating our efforts with Argonne’s high-speed computing capabilities for this project, we are able to provide some truly revolutionary images that could provide details about brain functions that we have never before been able to observe.”

Both techniques can produce petabytes of information a day and, according to the researchers, the next generations of both microscopes will increase that amount dramatically.

Images produced by these datasets have to be processed, reconstructed and analyzed. Through the ADSP, Gursoy and Kasthuri are developing a series of large-scale data and computational steps — a pipeline — that integrates exascale computational approaches into an entirely new set of tools for brain research.
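At terabyte-to-petabyte scale, such a pipeline cannot load a whole brain volume into memory at once; it streams the data through successive stages instead. The hypothetical sketch below shows the general shape of such a staged pipeline using lazy Python generators (the stage names and the trivial filter/threshold operations are placeholders for illustration, not the project's actual processing steps):

```python
import numpy as np

def read_chunks(volume, size):
    # Stage 1: stream the volume in fixed-size slabs
    # instead of loading it whole
    for i in range(0, volume.shape[0], size):
        yield volume[i:i + size]

def preprocess(chunks):
    # Stage 2: placeholder preprocessing (zero-mean each slab)
    for c in chunks:
        yield c - c.mean()

def segment(chunks, threshold=0.0):
    # Stage 3: placeholder analysis (binary mask of bright voxels)
    for c in chunks:
        yield c > threshold

# Stages compose lazily, so only one slab is resident at a time
volume = np.random.default_rng(0).normal(size=(100, 32, 32))
pipeline = segment(preprocess(read_chunks(volume, size=10)))
masks = list(pipeline)
```

On a machine like Theta, each stage would instead run as a parallel task over many nodes, but the key property is the same: data flows through the stages without ever materializing the full dataset in one place.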

Taming of the shrew

The first case study for this pipeline is the reconstruction of an entire adult shrewmouse brain, which, they estimate, will produce one exabyte of data, or one billion gigabytes. And the studies only get bigger from there.

“Machine learning will go through these datasets and help come up with predictive models. For this project, it can help with segmentation or reconstruction of the brain and help classify or identify features of interest,” said Vishwanath.
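In its simplest form, segmentation means assigning each voxel to a class such as "tissue" or "background." The toy example below uses a tiny unsupervised clustering of voxel intensities to illustrate the idea; real connectome segmentation relies on far more sophisticated models (e.g. deep neural networks), so treat this purely as a conceptual sketch:

```python
import numpy as np

def kmeans_1d(values, k=2, iters=20, seed=0):
    # Tiny k-means over voxel intensities: the cluster labels act
    # as a crude segmentation (e.g. tissue vs. background)
    rng = np.random.default_rng(seed)
    centers = rng.choice(values, size=k, replace=False)
    for _ in range(iters):
        # assign each voxel to its nearest cluster center
        labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
        # move each center to the mean of its assigned voxels
        for j in range(k):
            if np.any(labels == j):
                centers[j] = values[labels == j].mean()
    return labels, centers

# Synthetic voxel intensities: two well-separated populations
rng = np.random.default_rng(1)
vals = np.concatenate([rng.normal(0.1, 0.02, 500),   # dark "background"
                       rng.normal(0.9, 0.02, 500)])  # bright "tissue"
labels, centers = kmeans_1d(vals, k=2)
```

Intensity clustering alone cannot separate individual neurons that touch each other, which is why learned models trained on annotated examples are needed at electron-microscope resolution; the clustering above only conveys what "classifying features of interest" means at the voxel level.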

Lessons learned from the smaller shrewmouse brain will be applied to a large mouse brain, which constitutes a 10-fold increase in volume. Comparisons between the two will reveal how organizational structures form during development, from embryo to adult, and how they evolve. The reconstruction of a non-human primate brain, with a volume 100 times larger than a mouse brain, is being considered for a later study.

A neuroscientist and a computational scientist leave a synchrotron facility with studies from a mouse brain… armed with new techniques to analyze the data. The images produced by their work will provide a clearer understanding of how even the smallest changes in the brain play a role in the onset and progression of neurological conditions, such as Alzheimer’s disease and autism, and perhaps lead to improved treatments or even cures.

Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America’s scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science.

The U.S. Department of Energy’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit the Office of Science website.


Source: John Spizzirri, Argonne National Laboratory
