SC16 Precision Medicine Panel Proves HPC Matters

By John Russell

November 16, 2016

In virtually every way, precision medicine (PM) is the poster child for the HPC Matters mantra and was a good choice for the Monday panel opening SC16 (HPC Impacts on Precision Medicine: Life’s Future–The Next Frontier in Healthcare). PM’s tantalizing promise is to touch all of us, not just writ large but individually – effectively fighting disease, enhancing health and lifestyle, extending life, and necessarily contributing to basic science along the way. All of this can only happen with HPC.

In a session moderated by Steve Conway of IDC, five distinguished panelists from varying disciplines painted a powerful picture of PM’s prospects and challenges. Rather than dwelling in the weeds of HPC technology minutiae, the panel tackled the broad sweep of data-driven science, mixed-workload infrastructure, close collaboration across domains and organizations, and the need to make use of incremental advances while still pursuing transformational change.

It was a wide-ranging conversation, difficult to summarize. Here are the panelists and a sound bite from each one’s opening comments:

  • Mitchell Cohen, director of surgery, Denver Health Medical Center; Professor, University of Colorado School of Medicine. “If you get shot, stabbed, or run over I am your guy – a good person not to need,” quipped Cohen, momentarily underplaying his equal strength in basic medical research.
  • Warren Kibbe, director, Center for Biomedical Informatics and Information Technology (CBIIT); CIO, acting deputy director, National Cancer Institute. “[ACS] estimates there will be 1.7M new cases of cancer in the U.S. alone and 14M worldwide this year. Six hundred thousand will die. [However] the mortality rate in cancer has been declining every year since about 2000, so we are doing something right, but it’s clear we need to understand more about basic biology,” said Kibbe.
  • Steve Scott, chief technology officer, Cray Inc. “I’m the computer guy. We tend to talk about Pflops [and the like]. The real disconnect is between the computational science world and clinical scientists and physicians. We need [to] build solutions those people can use,” said Scott, who dove a bit deeper into the simulation and analytics technologies and the computer architecture required to deliver PM.
  • Fred Streitz, chief computational scientist and director of the High Performance Computing Innovation Center at Lawrence Livermore National Laboratory.[i] Talking about a population-scale data collection pilot that is part of the CANcer Distributed Learning Environment (CANDLE), said Streitz: “[It’s] where the rubber hits the road. It’s focused on [establishing] an effective national cancer surveillance program that takes advantage of all of the data we currently have and are already collecting in different ways and states – [and will first use] natural language processing to make sense of the data and regularize it, and then use machine learning to extract the information in a useful way.” (A toy sketch of this two-stage pipeline follows the list.)
  • Martha Head, senior director, The Noldor; acting head, Insights from Data at GlaxoSmithKline Pharmaceuticals. She tackled the lengthy and problematic drug R&D cycle (a decade) from hypothesis to therapy. “We have to go faster [and not] with the same processes and just rushing ever faster. We need transformation, a new approach that combines simulation, HPC and data analytics with experiment – a new engineering paradigm that almost treats an experiment as a subroutine or a function in a larger algorithm that we are running in our drug discovery process,” said Head.
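The two-stage pipeline Streitz described can be illustrated with a deliberately toy Python sketch: a crude stand-in for the natural language processing step that regularizes free-text reports, followed by a crude stand-in for the machine learning step that extracts a label. The report text, keyword lists, and site categories are all invented for illustration; the actual CANDLE pipeline is vastly more sophisticated.

import re
from collections import Counter

def regularize(report):
    # Stand-in for the NLP step: lowercase and tokenize free text.
    return re.findall(r"[a-z0-9]+", report.lower())

# Toy keyword lists per cancer site (entirely hypothetical).
KEYWORDS = {
    "lung": ["adenocarcinoma", "lung", "lobe", "bronchial"],
    "breast": ["ductal", "carcinoma", "breast", "her2"],
}

def extract_site(report):
    # Stand-in for the ML step: score each site by keyword overlap.
    tokens = Counter(regularize(report))
    scores = {site: sum(tokens[w] for w in words)
              for site, words in KEYWORDS.items()}
    return max(scores, key=scores.get)

print(extract_site("Invasive ductal carcinoma of the left breast, HER2 positive."))
# prints: breast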

Setting the stage, Conway emphasized that zeroing in on the most appropriate care and preventative treatment is also financially imperative. The U.S. spent about $3 trillion on healthcare in 2014 and is headed toward $4.8 trillion by 2021. Other countries do a bit better, with healthcare spending claiming 9-11 percent of GDP, yet that too is alarming.

PM, he said, will not only help save lives but also curb costs. It’s also becoming an important HPC market, so much so that IDC is tracking dozens of healthcare initiatives around the world and will add PM as a new market segment it tracks within commercial analytics. Clearly the stakes are high.

Warren Kibbe, NCI

Kibbe, a key player in NCI’s Moonshot program, is a powerful advocate of HPC tools’ capacity to advance medicine through database creation, machine learning-based techniques, and a variety of simulations. That said, he cautioned, the biggest hurdle remains unknown biology: we simply do not know enough basic biology, a point echoed by several other panelists. Basic research as part of PM overall will help.

The Cancer Moonshot, he noted, has been carefully road-mapping what it thinks can be impactful and achievable. CANDLE is one of those efforts. A blue-ribbon NCI panel, he added, has spelled out clear objectives and directional findings in a publicly available report.

These and other efforts, driven by HPC, will bear fruit over time. One example is the creation of the NCI Genomic Data Commons, intended to provide the cancer research community with a unified data repository that enables data sharing across cancer genomic studies in support of precision medicine. “I want to give a shout out to Bob Grossman and his team at the University of Chicago,” said Kibbe of the project. The idea is “to help take data out of existing repositories and get it into the cloud so people can use cloud computing more effectively.”

Kibbe offered a realistically measured view of the Cancer Moonshot’s goal. It will make significant, meaningful progress, but it’s a long road towards whatever it is that actually constitutes a cure for all cancers. Head of GSK agreed and emphasized the value of public-private collaborations like the one GSK has with NCI.

As described by NCI, “Department of Energy, NCI, and GlaxoSmithKline are forming a new public–private partnership designed to harness high-performance computing and diverse biological data to accelerate the drug discovery process and bring new cancer therapies from target to first in human trials in less than a year. This partnership will bring together scientists from multiple disciplines to advance our understanding of cancer by finding patterns in vast and complex datasets to accelerate the development of new cancer therapies.”

Given the wealth of genomics data and the relative paucity of mechanistic information, pattern recognition and database analysis have been primary tools in pursuing PM. Recent advances in these data-driven science techniques and their increasing use on HPC infrastructure are well aligned with PM’s purposes, said Scott. The emerging HPC system model, which emphasizes memory and data movement as well as intense computation (lots of flops), is a good fit for PM.
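As a purely illustrative example of the pattern-recognition style of analysis described above, the following toy sketch scans sample sequences against a small “database” of variant signatures. The KRAS mutation names are real, but the six-base motif strings and the patient sequences are invented for illustration; real analyses operate at population scale on HPC infrastructure.

# Hypothetical six-base "signatures" for two well-known RAS mutations.
# The mutation names are real; the motif strings are invented.
KNOWN_MOTIFS = {
    "KRAS_G12D": "GGTGAT",
    "KRAS_G12V": "GGTGTT",
}

# Invented sample sequences standing in for a genomic database.
samples = {
    "patient_001": "ATGACTGAATATAAACTTGTGGTGATGGCGGT",
    "patient_002": "ATGACTGAATATAAACTTGTGGTAGTGGCGGT",
}

for sample_id, seq in samples.items():
    hits = [name for name, motif in KNOWN_MOTIFS.items() if motif in seq]
    print(sample_id, hits if hits else "no known motif")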

NERSC Cray Cori supercomputer at Wang Hall – graphic panels installation – November 09, 2015.

“Computational demands and algorithm complexity are pushing us to build larger and larger machines, like Cori at NERSC, but they are fortunately pushing us in the direction of broader HPC. Computation tends to get all of the attention, [but] the real way to build a [supercomputer] today depends upon the memory systems and interconnect,” said Scott. A mixed-workload environment is what’s needed and also where supercomputing is trending.

“On the software side, common HPC techniques like simulation [of] molecular dynamics or finite element analysis or image processing can be brought to bear fairly successfully on PM problems, [while similarly] areas like large-scale graph analytics and machine learning are also critical.”

Streitz reviewed the directions of CANDLE’s three pilot projects (see figure below), one of which seeks to unravel the role of RAS mutations, present in about 30 percent of cancers including some of the toughest, by zeroing in on how RAS behaves on the cell membrane. RAS is involved in cell growth, and when it gets stuck in the ‘on’ position, cancer can be the result.

[Figure: NCI–DOE CANDLE pilot project collaborations]

Just today, it was announced that NVIDIA will join the project. Here’s an excerpt from the release:

“AI will be essential to achieve the objectives of the Cancer Moonshot,” said Rick Stevens, associate laboratory director for Computing, Environment and Life Sciences at Argonne National Laboratory. “New computing architectures have accelerated the training of neural networks by 50 times in just three years, and we expect more dramatic gains ahead.”

“GPU deep learning has given us a new tool to tackle grand challenges that have, up to now, been too complex for even the most powerful supercomputers,” said Jen-Hsun Huang, founder and chief executive officer, NVIDIA.

“Together with the Department of Energy and the National Cancer Institute, we are creating an AI supercomputing platform for cancer research. This ambitious collaboration is a giant leap in accelerating one of our nation’s greatest undertakings, the fight against cancer.” (See the full release: http://nvidianews.nvidia.com/news/nvidia-teams-with-national-cancer-institute-u-s-department-of-energy-to-create-ai-platform-for-accelerating-cancer-research)

One of the most interesting observations came from Cohen. To some extent, PM is trying to capture the knowledge experienced clinicians already have, codify it, and make it available. Think of the time required to train a complicated neural network, matching answers to desired outcomes based on experience, as akin to clinical training and experience. Some clinicians still push back against this idea, calling it autonomous medicine that will claim or erode their jobs, said Cohen.

This was clearly not his view; the concern is less about whether PM can contribute to progress and more about how it is implemented. Still, it suggested that creating physician-friendly tools and changing physician attitudes is at least part of the challenge.
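Cohen’s analogy maps neatly onto the basic supervised-learning loop. The toy perceptron below repeatedly compares its answers to desired outcomes and nudges its weights accordingly, a loose analogue of accumulating clinical experience; the “vital sign” features and urgency labels are invented for illustration.

import random

random.seed(0)
# Toy cases: (normalized heart rate, normalized blood pressure) -> urgent?
cases = [((0.9, 0.2), 1), ((0.8, 0.3), 1), ((0.4, 0.8), 0), ((0.3, 0.9), 0)]

w = [random.uniform(-0.5, 0.5), random.uniform(-0.5, 0.5)]
b, lr = 0.0, 0.1

for epoch in range(100):                  # repeated "experience"
    for (x1, x2), target in cases:
        pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
        err = target - pred               # compare answer to desired outcome
        w[0] += lr * err * x1             # adjust weights toward the
        w[1] += lr * err * x2             # desired outcome
        b += lr * err

print([round(v, 2) for v in w], round(b, 2))
# After training, the perceptron separates the toy urgent/non-urgent cases.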

Capturing the full scope of the SC16 panel is a tall order. PM is a broad undertaking with many components. The NCI Cancer Moonshot is making progress daily, as demonstrated by today’s NVIDIA announcement. Precision medicine, which depends critically on HPC, matters.

[i] Streitz filled in for Dimitri Kusnezov, Chief Scientist & Senior Advisor to the Secretary, U.S. Department of Energy, National Nuclear Security Administration, who was stuck in San Francisco because of travel problems.
