IDC’s Conway Sets Stage for SC16 Precision Medicine Panel

By John Russell

November 4, 2016

Editor’s Note: Kicking off SC this year is what promises to be a fascinating panel – HPC Impacts on Precision Medicine: Life’s Future–The Next Frontier in Healthcare. In this pre-SC16 Q&A, Steve Conway, research vice president in IDC’s High Performance Computing group and moderator of the panel, sets the stage. HPC, of course, has been transforming life sciences and medicine for nearly two decades. The transformation began in research – sequencing the Human Genome was as much an HPC achievement as it was a triumph of new DNA sequencing instruments. HPC has since moved steadily, albeit slowly, into the clinic. There are even special-purpose supercomputers – Anton, for example – dedicated to life sciences.

What’s emerged is this umbrella notion of precision medicine (PM): the idea that it should be possible to leverage all of our hard-won knowledge, new instrument technology, computational power, and a growing wealth of data spanning individuals to populations to deliver more effective therapies, preventative measures, and even lifestyle-enhancing choices. That’s a mouthful. Genomics is the most prominently touted enabler, but there are many pieces to the PM puzzle.

We are still at the edge of this sea. Only in recent years, for example, have we come to appreciate the influence the microbiome in each of us has on health. Yet while so much remains to be discovered, much has been accomplished. Precision medicine is already starting to transform healthcare; in certain cancers, for example, it has proven decisive. Expectations are high. Think of the many recent and ongoing initiatives such as the BRAIN Initiative, the Cancer Moonshot, the 1000 Genomes Project, the Blue Brain Project, and the Human Microbiome Project, to name just a few, all of which depend upon advanced computing. It seems fitting for SC to showcase the role HPC plays in life sciences research and medicine. – John Russell

HPCwire: The fact that Precision Medicine is the opening panel at SC strongly suggests the growing importance of HPC in making PM and basic life science research possible. Recognizing that SC is primarily a technology conference, could you frame the goals of this panel?

Steve Conway: Precision medicine, also called personalized medicine, promises to transform medical practice and healthcare spending by enabling diagnoses and treatment plans that are custom-tuned for each patient’s physiology, symptoms, medical history, DNA and even lifestyle. What constitutes a good outcome for a broken hand may be different for an office worker and a concert violinist. HPC is already playing a key role in early precision medicine initiatives around the world, by speeding up genome sequencing and by making it possible to quickly sift through millions of archived patient records to identify treatments that have had the best success rates for patients closely resembling the patient under investigation. Biology is fast becoming a digital science, and healthcare analytics is one of the fastest-growing new market segments for HPC. Precision medicine is happening at the intersection of biology, medical practice, healthcare economics, and data science. The expert panel at SC’16 will explore this emerging domain from these varied perspectives, with special emphasis on the major role HPC has already started to play.

Warren Kibbe, NCI

This is a pretty august group:

  • Mitchell Cohen, Director of Surgery, Denver Health Medical Center; Professor, University of Colorado School of Medicine.
  • Martha Head, Senior Director, The Noldor; Acting Head, Insights from Data at GlaxoSmithKline Pharmaceuticals
  • Warren Kibbe, Director, Center for Biomedical Informatics and Information Technology (CBIIT); Chief Information Officer; Acting Deputy Director; National Cancer Institute (NCI)
  • Dimitri Kusnezov, Chief Scientist & Senior Advisor to the Secretary, U.S. Department of Energy, National Nuclear Security Administration
  • Steve Scott, Chief Technology Officer, Cray Inc.

HPCwire: Today much of what constitutes PM is big data analytics. Within this context: a) what are the key technologies (compute/architectures, storage, informatics, etc.) being used, b) what are the big technology challenges/bottlenecks, and c) where do you expect near-term progress?

Conway: We’ll hear more about this from the experts on the panel, but in general the computer technologies being used today to support precision medicine range from purpose-built supercomputers such as IBM Watson, with its advanced natural language capability, to Linux clusters with the usual processors and software. One big challenge is getting access to detailed data on large enough patient populations—some big healthcare companies are investing a lot of money today to acquire more data. Another challenge is speed. An important decision-support goal over time is for the computer to spit out efficacy curves for treatment options in near-real time, while the patient is still sitting across from the doctor. Yet another challenge is the state of the data science—there’s a big need for tools that help users understand the data better, including benchmarks to verify that the results are useful.
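To make that decision-support goal concrete, here is a minimal sketch, with entirely hypothetical patient data, field names, and similarity measure, of the kind of cohort matching Conway describes: find the archived patients who most closely resemble the one under investigation, then rank treatments by their observed success rates within that cohort.

```python
# Toy sketch of precision-medicine decision support: match a new patient
# against archived records, then rank treatments by success rate within
# the matched cohort. All data, fields, and weights are hypothetical.
from collections import defaultdict

archive = [
    {"age": 61, "stage": 2, "marker": 0.80, "treatment": "A", "success": True},
    {"age": 58, "stage": 2, "marker": 0.70, "treatment": "A", "success": True},
    {"age": 64, "stage": 3, "marker": 0.90, "treatment": "B", "success": False},
    {"age": 60, "stage": 2, "marker": 0.75, "treatment": "B", "success": True},
]

def similarity(p, q):
    """Crude inverse-distance similarity over roughly normalized features."""
    d = (abs(p["age"] - q["age"]) / 50.0
         + abs(p["stage"] - q["stage"]) / 4.0
         + abs(p["marker"] - q["marker"]))
    return 1.0 / (1.0 + d)

def rank_treatments(patient, records, k=3):
    """Rank treatments by success rate among the k most similar patients."""
    cohort = sorted(records, key=lambda r: similarity(patient, r), reverse=True)[:k]
    stats = defaultdict(lambda: [0, 0])  # treatment -> [successes, trials]
    for r in cohort:
        stats[r["treatment"]][0] += r["success"]
        stats[r["treatment"]][1] += 1
    return sorted(((s / n, t) for t, (s, n) in stats.items()), reverse=True)

new_patient = {"age": 59, "stage": 2, "marker": 0.72}
print(rank_treatments(new_patient, archive))  # [(rate, treatment), ...]
```

At production scale this toy loop becomes a distributed similarity search over millions of records, which is why returning efficacy estimates while the patient is still in the room is as much an HPC problem as a statistical one.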

HPCwire: How significant is the relative lack of HPC expertise and general computational literacy of most clinical physicians and even life scientists generally? The command line is hardly a friendly place for them. What, if anything, should be done to support them and to raise their computational skill level?

Conway: One of the biggest barriers across all of HPC is the C. P. Snow “two cultures” problem, where in the case of HPC you have computer scientists and domain scientists trying to communicate with each other using different languages. In precision medicine you might have HPC vendors talking about integer or floating point operations per second, while the buyers and users want to hear about cancer detections per second. My own opinion is that in precision medicine, to be successful HPC vendors will need to bend more toward the users than the other way around. I don’t think vendors can expect users to make a big effort to become more proficient in HPC. It will be interesting to hear what the panelists at SC’16 have to say about this.

HPCwire: How should we expect delivery of PM technology to evolve? IBM Watson has received a lot of attention using a cloud-like model while many institutions have on-premise resources. How will the PM delivery ecosystem (HPC infrastructure) evolve?

Conway: Again, you’ll get a fuller discussion of this during the SC panel session, but it seems clear that an effective precision medicine environment will involve both on-premise and cloud resources, presumably integrated in a way that’s transparent to users. You’ll need on-premise resources for brute-force computing and cloud resources for tasks such as data research, records transfer and general communication. Most healthcare systems already rely on private clouds for communication among providers and between providers and patients. The brute-force computing will be needed for near-real-time diagnosis and treatment planning.

HPCwire: What are two or three examples of the most advanced HPC-based PM systems in use today, and what makes them distinct?

Conway: Let’s start with IBM Watson. In 2011, Watson stunned a huge American television audience by defeating two past human champions of the Jeopardy! game show. The great achievement of this digital brain was its ability to “understand” natural language — specifically, natural language expressed in the interrogatory syntax of the game show. On the heels of this triumph, IBM announced in January 2014 that it would invest $1 billion to advance Watson’s decision-making abilities for major commercial markets, including healthcare. Not much later, in May 2015, IBM said 14 U.S. cancer treatment centers had signed on to receive personalized treatment plans selected by a Watson supercomputer. Since its Jeopardy! days, Watson has shrunk “from the size of a master bedroom to three stacked pizza boxes.” Watson will parse the DNA of each patient’s cancer and recommend what it considers the optimal medical treatment, so it’s a powerful decision-support tool for healthcare providers.

The Center for Pediatric Genomic Medicine at Children’s Mercy Hospital, Kansas City, Missouri, has been using supercomputer power to help save the lives of critically ill children. In 2010, the center’s work was named one of Time magazine’s top 10 medical breakthroughs. Roughly 4,100 genetic diseases affect humans, and these are the main causes of infant deaths. But identifying which genetic disease is affecting a critically ill child isn’t easy. For one infant suffering from liver failure, the center used 25 hours of supercomputer time to analyze 120 billion nucleotide sequences and narrowed the problem down to two genetic variants. This allowed the doctors to begin treatment with corticosteroids and immunoglobulin. Thanks to this highly accurate diagnosis of the problem and pinpointed treatment, the baby is alive and well today. For 48% of the cases the center works on today, supercomputer-powered genetic diagnosis points the way toward a more effective treatment.
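For a sense of what that narrowing-down means computationally, here is a heavily simplified sketch of the filtering step in a rapid genetic-diagnosis pipeline. The gene names, population frequencies, and cutoffs are all hypothetical, and a real pipeline applies many more filters across millions of called variants.

```python
# Toy variant-filtering step in a rapid genetic-diagnosis pipeline.
# Real pipelines filter millions of variants; all values here are made up.

candidate_variants = [
    {"gene": "GENE_A", "pop_freq": 0.3000, "impact": "low"},
    {"gene": "GENE_B", "pop_freq": 0.0001, "impact": "high"},
    {"gene": "GENE_C", "pop_freq": 0.0002, "impact": "high"},
    {"gene": "GENE_D", "pop_freq": 0.0500, "impact": "moderate"},
]

# Genes previously associated with the patient's phenotype (e.g. liver failure).
phenotype_panel = {"GENE_B", "GENE_C"}

def filter_variants(variants, panel, max_freq=0.001):
    """Keep rare, high-impact variants in phenotype-relevant genes."""
    return [v for v in variants
            if v["pop_freq"] <= max_freq      # rare in the population
            and v["impact"] == "high"         # predicted damaging
            and v["gene"] in panel]           # matches the symptoms

shortlist = filter_variants(candidate_variants, phenotype_panel)
print([v["gene"] for v in shortlist])  # -> ['GENE_B', 'GENE_C']
```

The hard part at this scale is upstream: aligning reads and calling variants from 120 billion nucleotide sequences fast enough that a filter like this has something to work with, which is where the 25 hours of supercomputer time go.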

The University of Toronto’s SickKids Centre for Computational Medicine uses a supercomputer operating at 107 trillion calculations per second to predict the minute differences between individual children in order to identify the best treatment for each child in its care.

Researchers at the University of Oslo (Norway) are using a supercomputer to help identify the genes that cause bowel and prostate cancer, two common forms of the disease. There are 4,000 new cases of bowel cancer in Norway every year. Only 6 out of 10 patients survive the first five years. Prostate cancer affects 5,000 Norwegians every year and 9 out of 10 patients survive. The researchers are employing the supercomputer to compare the genetic makeup of healthy cells and cancer cells, paying special attention to complex genes called fusion genes.

The Frédéric Joliot Hospital Department (Orsay, France) is using the powerful supercomputer at the French Alternative Energies and Atomic Energy Commission (CEA) in Bruyères-le-Châtel to improve understanding of how tracers used in PET scans for cancer diagnosis distribute themselves through the body. The goals of this research are to optimize PET scan data analysis and, later on, to personalize the PET scan process for each patient in order to produce better outcomes.

Doctors at Australia’s Victor Chang Cardiac Research Institute are using supercomputer-based gaming technology to identify how individuals’ genetic makeups can affect the severity of their heart rhythm diseases. The researchers built a virtual heart, then applied the recorded heartbeats of patients to the digital heart model in order to spot abnormal electrocardiogram signals. The whole process took 10 days using HPC, instead of the 21 years it would have taken with a contemporary personal computer. In other words, this important work would be impractical without the supercomputer.
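As a loose illustration of that workflow (not the institute’s actual model), the sketch below integrates the FitzHugh-Nagumo equations, a standard minimal stand-in for excitable cardiac cells, under a patient-specific recovery parameter and flags the simulated trace when it drifts too far from a healthy reference. All parameters and the deviation threshold are hypothetical.

```python
# Minimal illustration of simulation-based cardiac screening: run an
# excitable-cell model (FitzHugh-Nagumo) with a patient-specific parameter
# and flag traces that deviate from a healthy reference. Values are made up.
import numpy as np

def simulate(b, steps=4000, dt=0.05):
    """Integrate the FitzHugh-Nagumo equations with forward Euler."""
    v, w = -1.0, 1.0
    trace = np.empty(steps)
    for i in range(steps):
        dv = v - v**3 / 3.0 - w + 0.5    # membrane potential, drive I = 0.5
        dw = 0.08 * (v + 0.7 - b * w)    # recovery variable
        v, w = v + dt * dv, w + dt * dw
        trace[i] = v
    return trace

healthy = simulate(b=0.8)                # reference dynamics
patient = simulate(b=1.6)                # altered recovery parameter

deviation = float(np.sqrt(np.mean((patient - healthy) ** 2)))
verdict = "abnormal" if deviation > 0.3 else "normal"
print(f"RMS deviation from reference: {deviation:.3f} -> {verdict}")
```

A production model resolves millions of coupled cells in a 3D heart geometry rather than a single cell, which is what turns a desktop-scale calculation into the 10-day HPC run described above.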

HPCwire: To a large degree, mechanistic modeling and simulation – beyond compound structure analysis and docking scoring – hasn’t played a major role in the clinic or in basic research. Do you think this will change, and what will drive the change?

Anton 1 supercomputer specialized for life sciences modeling and simulation

Conway: Modeling and simulation will continue to play a key role in designing a wide array of medical technology products used in clinical practice, from heart pacemakers to diagnostic imaging tools such as MRI and PET scanners. M&S is also crucial for genome sequencing and precision dosing of pharmaceuticals, both of which are important for precision medicine. I think M&S and advanced analytics will go hand-in-hand in this emerging market.

HPCwire: What haven’t I asked that I should?

Conway: Just that precision medicine will be the next market segment IDC adds to the ones we track in our high performance data analysis, or HPDA, practice. Precision medicine will join fraud and anomaly detection, affinity marketing and business intelligence as new segments that are made up mainly of large commercial firms that have adopted HPC for the first time. We forecast that the whole HPDA server and storage market will exceed $5 billion in 2020. Of that amount, about $3.5 billion will come from existing HPC sites and about $1.6 billion will be added to the HPC market by new commercial buyers. Assuming that precision medicine fulfills its promise over the next decade, it is likely to become the single largest market for HPDA, that is, data-intensive computing using HPC resources.

 

Steve Conway is research vice president in IDC’s High Performance Computing group, where he plays a major role in directing and implementing HPC research related to the worldwide market for technical servers and supercomputers. He is a 25-year veteran of the HPC and IT industries. Before joining IDC, Conway was vice president of corporate communications and investor relations for Cray, and before that he had stints at SGI and CompuServe Corporation.
