HPC for Life: Genomics, Brain Research, and Beyond

By Warren Froelich

July 19, 2018

Editor’s note: In part I, “HPC Serves as ‘Rosetta Stone’ for the Information Age,” we explored how high-performance computing is transforming digital data into valuable insight and leading to amazing discoveries. Part II follows the path of HPC into new areas of brain research and astrophysics.

During the past few decades, the life sciences have witnessed one landmark discovery after another with the aid of HPC, paving the way toward a new era of personalized treatments based on an individual’s genetic makeup, and drugs capable of attacking previously intractable ailments with few side effects.

Genomics research is generating torrents of biological data to help “understand the rules of life” for the personalized treatments widely expected to be the focus of tomorrow’s medicine. DNA sequencing has rapidly moved from the analysis of data sets megabytes in size to entire genomes gigabytes in size. Meanwhile, the cost of sequencing has dropped from about $10,000 per genome in 2010 to $1,000 in 2017, demanding ever faster and more refined computational resources to process and analyze all this data.

In one recent genome analysis, an international team led by Jonathan Sebat, a professor of psychiatry, cellular and molecular medicine and pediatrics at UC San Diego School of Medicine, identified a risk factor that may explain some of the genetic causes for autism: rare inherited variants in regions of non-coding DNA. Researchers have known for about a decade that the genetic causes of autism include so-called de novo mutations, gene mutations that appear for the first time in a child rather than being inherited. But those sequences represented only 2 percent of the genome. To investigate the remaining 98 percent of the genome in ASD (autism spectrum disorder), Sebat and colleagues analyzed the complete genomes of 9,274 subjects from 2,600 families, a combined data total in the range of terabytes.
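For a rough sense of that scale, here is a back-of-envelope sketch in Python; the figure of roughly 40 GB for a compressed whole genome is an illustrative assumption, not a number reported by the study:

# Back-of-envelope estimate of the study's data footprint.
GENOMES = 9274            # subjects in the Sebat study
GB_PER_GENOME = 40        # assumed average for a compressed whole genome
total_tb = GENOMES * GB_PER_GENOME / 1000.0
print(f"~{total_tb:,.0f} TB across {GENOMES:,} genomes")
# -> roughly 370 TB, i.e., well into the terabyte range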

As reported in the April 20, 2018, issue of Science, the DNA sequences were analyzed with SDSC’s Comet supercomputer, along with data from other large studies from the Simons Simplex Collection and the Autism Speaks MSSNG Whole Genome Sequencing Project.

“Whole genome sequencing data processing and analysis are both computationally and resource intensive,” said Madhusudan Gujral, an analyst with SDSC and co-author of the paper. “Using Comet, processing and identifying specific structural variants from a single genome took about 2 ½ days.”

SDSC Distinguished Scientist Wayne Pfeiffer added that with Comet’s nearly 2,000 nodes and several petabytes of scratch space, tens of genomes can be processed at the same time, taking the data processing requirement from months down to weeks.
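A minimal sketch shows why that concurrency matters, using Gujral’s figure of about 2.5 days per genome; the batch size and degree of concurrency are illustrative assumptions:

import math

DAYS_PER_GENOME = 2.5     # from Gujral's estimate above
genomes = 60              # assumed batch size
concurrent = 20           # assumed number of simultaneous jobs

serial_days = genomes * DAYS_PER_GENOME
parallel_days = math.ceil(genomes / concurrent) * DAYS_PER_GENOME
print(f"serial: {serial_days:.0f} days (~{serial_days / 30:.0f} months)")
print(f"{concurrent} concurrent jobs: {parallel_days:.1f} days")
# -> 150 days (about five months) serial vs. 7.5 days concurrent:
#    months down to roughly a week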

In cryo-electron microscopy (cryo-EM), biological samples are flash-frozen so rapidly that damaging ice crystals are unable to form. As a result, researchers can view highly detailed, reconstructed 3D models of intricate, microscopic biological structures in near-native states. Pictured: one of the cryo-electron microscopes available to researchers at the Timothy Baker Lab at UC San Diego. Image credit: Jon Chi Lou, SDSC

Not long ago, the following might have been considered an act of wizardry from a Harry Potter novel. First, take a speck of biomolecular matter, invisible to the naked eye, and deep-freeze it to near absolute zero. Then blast this material, now frozen in time, with an electron beam. Finally, add the power of a supercomputer aided by a set of problem-solving rules called algorithms. And, presto! A three-dimensional image of the original biological speck appears on a computer monitor at atomic resolution. Not really magic or even sleight of hand, this innovation, named cryo-electron microscopy or simply cryo-EM, garnered the 2017 Nobel Prize in Chemistry for the technology, whose development dates back to the 1970s.

Today, researchers seeking to unravel the structure of proteins in atomic detail, in hopes of treating many intractable diseases, are increasingly turning to cryo-EM as an alternative to time-tested X-ray crystallography. A key advantage of cryo-EM is that no crystallization of the protein is required, a barrier for the many proteins that defy being turned into crystals. Even so, the technology didn’t take off until the development of more sensitive electron detectors and the advanced computational algorithms needed to turn reams of data into often aesthetically pleasing three-dimensional images.
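At its computational core, single-particle cryo-EM aligns and averages enormous numbers of noisy 2D particle images so that signal accumulates while noise cancels (production pipelines also estimate particle orientations and reconstruct in 3D). The following 2D-only NumPy sketch illustrates the idea; the synthetic “particle,” noise level, and template-based alignment are all simplifying assumptions:

import numpy as np

rng = np.random.default_rng(0)
N = 64
y, x = np.mgrid[0:N, 0:N]
template = ((x - 32)**2 + (y - 32)**2 < 80).astype(float)  # synthetic particle

def noisy_view(img):
    # A randomly shifted copy of the particle, buried in heavy noise.
    dy, dx = rng.integers(-8, 9, size=2)
    return np.roll(img, (dy, dx), axis=(0, 1)) + rng.normal(0.0, 2.0, img.shape)

def align_to(ref, img):
    # Find the shift that best matches ref via FFT cross-correlation.
    cc = np.fft.ifft2(np.fft.fft2(ref) * np.conj(np.fft.fft2(img))).real
    dy, dx = np.unravel_index(np.argmax(cc), cc.shape)
    return np.roll(img, (dy, dx), axis=(0, 1))

views = [noisy_view(template) for _ in range(500)]
avg = np.mean([align_to(template, v) for v in views], axis=0)
print("residual noise std:", round(float(np.std(avg - template)), 3),
      "vs. 2.0 per raw image")   # averaging 500 aligned views cuts noise ~22x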

“About 10 years ago, cryo-EM was known as blob-biology,” said Robert Sinkovits, director of scientific computing applications at SDSC. “You got an overall shape, but not at the resolution you would get with X-ray crystallography, which required working with a crystal. But it was kind of a black art to create these crystals and some things simply wouldn’t crystallize. You can use cryo-EM for just about anything.”

Several molecular biologists and chemists at UC San Diego are taking advantage of the university’s cryo-EM laboratory and SDSC’s computing resources to reveal the inner workings and interactions of several targeted proteins critical to the understanding of diseases such as fragile X syndrome and childhood liver cancer.

“This will be a growing area for HPC, in part, as we continue to automate the process,” said Sinkovits.

Machine Learning and Brain Implants

It’s a concept that can boggle the brain, and ironically it is now being used to imitate that very organ. Called “machine learning,” this innovation typically involves training a computer or robot on millions of examples so that the machine learns to derive insight and meaning from new data as time advances.

Recently, a collaborative team led by researchers at SDSC and the Downstate Medical Center in Brooklyn, N.Y., applied a novel computer algorithm to mimic how the brain learns, with the aid of Comet and the Center’s Neuroscience Gateway. The goal: to identify and replicate neural circuitry that resembles the way an unimpaired brain controls limb movement.

The study, published in the March-May 2017 issue of the IBM Journal of Research and Development, laid the groundwork for developing realistic “biomimetic neuroprosthetics,” brain implants that replicate brain circuits and function, which one day could replace brain cells lost to tumors, stroke or other diseases.

The researchers trained their model using spike-timing-dependent plasticity (STDP) and reinforcement learning, believed to be the basis for memory and learning in mammalian brains. Briefly, STDP refers to the ability of synaptic connections to become stronger or weaker depending on when they are activated relative to each other; this is meshed with a system of biochemical rewards or punishments tied to correct or incorrect decisions.
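A minimal sketch of a reward-modulated STDP update of this general flavor is below; the time constants, amplitudes and function names are illustrative, not taken from the paper:

import math

TAU = 20.0                     # ms, plasticity time constant (assumed)
A_PLUS, A_MINUS = 0.01, 0.012  # potentiation/depression amplitudes (assumed)

def stdp_dw(t_pre, t_post):
    # Weight change from one pre/post spike pairing (times in ms).
    dt = t_post - t_pre
    if dt > 0:   # pre fires before post: strengthen the synapse
        return A_PLUS * math.exp(-dt / TAU)
    return -A_MINUS * math.exp(dt / TAU)   # otherwise: weaken it

def update(w, pairings, reward):
    # Reward (+1) keeps the STDP change; punishment (-1) reverses it.
    for t_pre, t_post in pairings:
        w += reward * stdp_dw(t_pre, t_post)
    return min(max(w, 0.0), 1.0)           # keep the weight in [0, 1]

w = update(0.5, [(10.0, 15.0), (40.0, 38.0)], reward=+1)
print(f"weight after rewarded pairings: {w:.3f}")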

“Only the fittest individuals (models) remain, those models that are better able to learn, survive and propagate their genes,” said Salvador Dura-Bernal, a research assistant professor in physiology and pharmacology at Downstate, and the paper’s first author.

As for the role of HPC in this study: “Since thousands of parameter combinations need to be evaluated, this is only possible by running the simulations using HPC resources such as those provided by SDSC,” said Dura-Bernal. “We estimated that using a single processor instead of the Comet system would have taken almost six years to obtain the same results.”
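The underlying pattern is an embarrassingly parallel parameter sweep: each parameter combination can be evaluated independently, so thousands of simulations can run side by side. Here is a minimal sketch of that pattern with Python’s multiprocessing; the two parameters and the stand-in scoring function are assumptions, not the study’s actual simulator:

from itertools import product
from multiprocessing import Pool

def evaluate(params):
    gain, threshold = params
    # Stand-in fitness; a real run would launch a network simulation here.
    score = -(gain - 1.2)**2 - (threshold - 0.3)**2
    return params, score

if __name__ == "__main__":
    grid = list(product([0.5, 1.0, 1.5, 2.0], [0.1, 0.3, 0.5]))
    with Pool() as pool:               # one worker per available core
        results = pool.map(evaluate, grid)
    best_params, best_score = max(results, key=lambda r: r[1])
    print("best:", best_params, "score:", round(best_score, 3))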

On the Horizon

Other impressive data producers are waiting in the wings, posing further challenges for tomorrow’s super facilities. For example, an ambitious upgrade to the Large Hadron Collider will result in a substantial increase in the intensity of proton beam collisions, far greater than anything achieved before. From the mid-2020s onward, the experiments at the LHC are expected to yield 10 times more data each year than the combined output generated during the three years leading up to the Higgs discovery. Beyond that, future accelerators are being discussed that would be housed in 100-km-long tunnels to reach collision energies many times that of the LHC, while still others suggest colliders based on different geometries, perhaps linear rather than circular. More powerful machines, by definition, will translate into torrents of more data to digest and analyze.

The future site of the Simons Observatory, located in the high Atacama Desert in northern Chile inside the Chajnantor Science Preserve. (Photo licensed under CC BY-SA 4.0)

Under an agreement with the Simons Foundation Flatiron Institute, SDSC’s Gordon supercomputer is being repurposed to provide computational support for the POLARBEAR experiment and its successor project, the Simons Array. The projects, led by UC Berkeley and funded first by the Simons Foundation and then by the NSF under a five-year, $5 million grant, will deploy the most powerful cosmic microwave background (CMB) radiation telescope and detector ever made to detect what is, in essence, the leftover ‘heat’ from the Big Bang in the form of microwave radiation.

“The POLARBEAR experiment alone collects nearly one gigabyte of data every day that must be analyzed in real time,” said Brian Keating, a professor of physics at UC San Diego’s Center for Astrophysics & Space Sciences and co-PI for the POLARBEAR/Simons Array project.

“This is an intensive process that requires dozens of sophisticated tests to assure the quality of the data. Only by leveraging resources such as Gordon are we able to continue our legacy of success.”

“As the scale of data and complexity of these experimental projects increase, it is more important than ever before that centers like SDSC respond by providing HPC systems and expertise that become part of the integrated ecosystem of research and discovery,” said SDSC Director Michael Norman.
