IDC’s Conway Sets Stage for SC16 Precision Medicine Panel

By John Russell

November 4, 2016

Editor’s Note: Kicking off SC this year is what promises to be a fascinating panel – HPC Impacts on Precision Medicine: Life’s Future–The Next Frontier in Healthcare. In this pre-SC16 Q&A, Steve Conway, research vice president in IDC’s High Performance Computing group and moderator of the panel, sets the stage. HPC, of course, has been transforming life sciences and medicine for nearly two decades. The transformation began in research – sequencing the human genome was as much an HPC achievement as it was a triumph of new DNA sequencing instruments. HPC has since moved steadily, albeit slowly, into the clinic. There are even special-purpose supercomputers – Anton – dedicated to life sciences.

What’s emerged is this umbrella notion of precision medicine (PM), the idea that it should be possible to leverage all of our hard-won knowledge, new instrument technology, computational power, and a growing wealth of data spanning individuals to populations to deliver more effective therapies, preventative measures, and even lifestyle-enhancing choices. That’s a mouthful. Genomics is the most prominently touted enabler, but there are many pieces to the PM puzzle.

We are still at the edge of this sea. Only in recent years, for example, have we come to appreciate the influence the microbiome in each of us has on health. Yet while so much remains to be discovered, much has been accomplished. Precision medicine is already starting to transform healthcare; in certain cancers, for example, it has proven decisive. Expectations are high. Think of the many recent and ongoing initiatives such as the BRAIN Initiative, the Cancer Moonshot, the 1000 Genomes Project, the Blue Brain Project, and the Human Microbiome Project, to name just a few, all of which depend upon advanced computing. It seems fitting for SC to showcase the important role HPC plays in life sciences research and medicine. – John Russell

HPCwire: The fact that Precision Medicine is the opening panel at SC strongly suggests the growing importance of HPC in making PM and basic life science research possible. Recognizing SC is primarily a technology conference, could you frame the goals of this panel?

Steve Conway: Precision medicine, also called personalized medicine, promises to transform medical practice and healthcare spending by enabling diagnoses and treatment plans that are custom-tuned for each patient’s physiology, symptoms, medical history, DNA and even lifestyle. What constitutes a good outcome for a broken hand may be different for an office worker and a concert violinist. HPC is already playing a key role in early precision medicine initiatives around the world, by speeding up genome sequencing and by making it possible to quickly sift through millions of archived patient records to identify treatments that have had the best success rates for patients closely resembling the patient under investigation. Biology is fast becoming a digital science, and healthcare analytics is one of the fastest-growing new market segments for HPC. Precision medicine is happening at the intersection of biology, medical practice, healthcare economics, and data science. The expert panel at SC’16 will explore this emerging domain from these varied perspectives, with special emphasis on the major role HPC has already started to play.
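To make the record-sifting idea concrete, here is a minimal Python sketch of one way such decision support can work: rank candidate treatments by their observed success rate among the archived patients most similar to the patient under investigation. The features, field names, and similarity measure here are hypothetical illustrations, not drawn from any production system.

```python
# Hypothetical sketch: rank treatments for a new patient by the historical
# success rate observed among the most similar archived patients.
# All field names and data are illustrative, not from any real system.
from collections import defaultdict
import math

def similarity(a, b):
    """Inverse-distance similarity over a few normalized numeric features."""
    dist = math.sqrt(sum((a[k] - b[k]) ** 2 for k in ("age", "bmi", "biomarker")))
    return 1.0 / (1.0 + dist)

def rank_treatments(patient, records, k=100):
    """Find the k most similar archived patients, then rank treatments
    by their observed success rate within that cohort."""
    nearest = sorted(records, key=lambda r: similarity(patient, r), reverse=True)[:k]
    outcomes = defaultdict(lambda: [0, 0])           # treatment -> [successes, total]
    for r in nearest:
        outcomes[r["treatment"]][0] += r["success"]  # success is 0 or 1
        outcomes[r["treatment"]][1] += 1
    return sorted(((s / n, t) for t, (s, n) in outcomes.items()), reverse=True)

# Toy usage with three archived records
records = [
    {"age": 0.6, "bmi": 0.4, "biomarker": 0.7, "treatment": "A", "success": 1},
    {"age": 0.6, "bmi": 0.5, "biomarker": 0.6, "treatment": "B", "success": 0},
    {"age": 0.2, "bmi": 0.9, "biomarker": 0.1, "treatment": "A", "success": 0},
]
print(rank_treatments({"age": 0.6, "bmi": 0.45, "biomarker": 0.65}, records, k=2))
```

In practice the cohort search would run over millions of records on HPC resources, with far richer clinical features and proper statistical controls.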

Warren Kibbe, NCI

This is a pretty august group:

  • Mitchell Cohen, Director of Surgery, Denver Health Medical Center; Professor, University of Colorado School of Medicine.
  • Martha Head, Senior Director, The Noldor; Acting Head, Insights from Data at GlaxoSmithKline Pharmaceuticals
  • Warren Kibbe, Director, Center for Biomedical Informatics and Information Technology (CBIIT); Chief Information Officer; Acting Deputy Director; National Cancer Institute (NCI)
  • Dimitri Kusnezov, Chief Scientist & Senior Advisor to the Secretary, U.S. Department of Energy, National Nuclear Security Administration
  • Steve Scott, Chief Technology Officer, Cray Inc.

HPCwire: Today, much of what constitutes PM is big data analytics. Within this context: a) what are the key technologies (compute/architectures, storage, informatics, etc.) being used, b) what are the big technology challenges/bottlenecks, and c) where do you expect near-term progress?

Conway: We’ll hear more about this from the experts on the panel, but in general the computer technologies being used today to support precision medicine vary from purpose-built supercomputers such as IBM Watson with its advanced natural language capability to Linux clusters with the usual processors and software. One big challenge is getting access to detailed data on large enough patient populations—some big healthcare companies are investing a lot of money today to acquire more data. Another challenge is speed. An important decision-support goal over time is for the computer to spit out efficacy curves for treatment options in near-real time, while the patient is still sitting across from the doctor. Yet another challenge is the state of the data science—there’s a big need for tools that help users understand the data better, including benchmarks to verify that the results are useful.
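The “efficacy curves” Conway mentions are, in essence, survival curves computed per treatment option. As a hedged illustration, the sketch below computes a standard Kaplan-Meier estimator from archived (time-to-event, censoring) outcome pairs; a real system would compute this per treatment arm over a matched cohort, in near-real time.

```python
# Minimal sketch: a Kaplan-Meier survival ("efficacy") curve computed from
# archived (time, event) outcome pairs. Purely illustrative toy data.
def kaplan_meier(durations, events):
    """durations: time to event or censoring; events: 1 = event, 0 = censored.
    Returns (times, survival probabilities)."""
    order = sorted(range(len(durations)), key=lambda i: durations[i])
    at_risk, surv = len(durations), 1.0
    times, probs = [], []
    i = 0
    while i < len(order):
        t = durations[order[i]]
        deaths = censored = 0
        while i < len(order) and durations[order[i]] == t:
            if events[order[i]]:
                deaths += 1
            else:
                censored += 1
            i += 1
        if deaths:
            surv *= (at_risk - deaths) / at_risk
            times.append(t)
            probs.append(surv)
        at_risk -= deaths + censored
    return times, probs

# Toy cohort: months until relapse (event=1) or loss to follow-up (event=0)
durations = [3, 5, 5, 8, 12, 12, 15]
events    = [1, 1, 0, 1,  0,  1,  0]
print(kaplan_meier(durations, events))
```

Producing such curves for several treatment options while the patient is still in the room is mostly a data-access and throughput problem, which is where HPC comes in.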

HPCwire: How significant is the relative lack of HPC expertise and general computational literacy of most clinical physicians and even life scientists generally? The command line is hardly a friendly place for them. What, if anything, should be done to support them and to raise their computational skill level?

Conway: One of the biggest barriers across all of HPC is the C. P. Snow “two cultures” problem, where in the case of HPC you have computer scientists and domain scientists trying to communicate with each other using different languages. In precision medicine you might have HPC vendors talking about integer or floating point operations per second, while the buyers and users want to hear about cancer detections per second. My own opinion is that in precision medicine, to be successful HPC vendors will need to bend more toward the users than the other way around. I don’t think vendors can expect users to make a big effort to become more proficient in HPC. It will be interesting to hear what the panelists at SC’16 have to say about this.

HPCwire: How should we expect delivery of PM technology to evolve? IBM Watson has received a lot of attention using a cloud-like model while many institutions have on-premise resources. How will the PM delivery ecosystem (HPC infrastructure) evolve?

Conway: Again, you’ll get a fuller discussion of this during the SC panel session, but it seems clear that an effective precision medicine environment will involve both on-premise and cloud resources, presumably integrated in a way that’s transparent to users. You’ll need on-premise resources for brute force computing and cloud resources for things including data research, records transfer and general communication. Most healthcare systems already rely on private clouds for communication among providers and between providers and patients. The brute force computing will be needed for near-real time diagnosis and treatment planning.
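As a rough illustration of that hybrid model, the hypothetical dispatcher below routes latency-critical clinical jobs to on-premise HPC and bulk data or communication tasks to a private cloud. The job kinds and thresholds are invented for the example; no real scheduler is being described.

```python
# Hypothetical sketch of the hybrid delivery model Conway describes: route
# latency-critical "brute force" jobs to on-premise HPC, and data research,
# records transfer, or communication tasks to cloud resources.
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    kind: str              # "diagnosis", "records_transfer", "data_research", ...
    deadline_seconds: int  # how soon the result is needed

def route(job):
    """Near-real-time clinical work stays on-premise; everything else goes
    to the private cloud. Thresholds here are arbitrary illustrations."""
    if job.kind == "diagnosis" or job.deadline_seconds < 300:
        return "on_premise_cluster"
    return "private_cloud"

jobs = [
    Job("treatment_efficacy_curves", "diagnosis", 60),
    Job("cohort_study_etl", "data_research", 86400),
    Job("send_records_to_specialist", "records_transfer", 3600),
]
for j in jobs:
    print(j.name, "->", route(j))
```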

HPCwire: What are two or three examples of the most advanced HPC-based PM systems in use today, and what makes them distinct?

Conway: Let’s start with IBM Watson. In 2011, Watson stunned a huge American television audience by defeating two past human champions of the Jeopardy! game show. The great achievement of this digital brain was its ability to “understand” natural language — specifically, natural language expressed in the interrogatory syntax of the game show. On the heels of this triumph, IBM announced in January 2014 that it would invest $1 billion to advance Watson’s decision-making abilities for major commercial markets, including healthcare. Not much later, in May 2015, IBM said 14 U.S. cancer treatment centers had signed on to receive personalized treatment plans selected by a Watson supercomputer. Watson has shrunk since its Jeopardy! days “from the size of a master bedroom to three stacked pizza boxes.” Watson parses the DNA of each patient’s cancer and recommends what it considers the optimal medical treatment, making it a powerful decision-support tool for healthcare providers.

The Center for Pediatric Genomic Medicine at Children’s Mercy Hospital, Kansas City, Missouri, has been using supercomputer power to help save the lives of critically ill children. In 2010, the center’s work was named one of Time magazine’s top 10 medical breakthroughs. Roughly 4,100 genetic diseases affect humans, and these are the main causes of infant deaths. But identifying which genetic disease is affecting a critically ill child isn’t easy. For one infant suffering from liver failure, the center used 25 hours of supercomputer time to analyze 120 billion nucleotide sequences and narrowed the problem down to two genetic variants. This allowed the doctors to begin treatment with corticosteroids and immunoglobulin. Thanks to this highly accurate diagnosis of the problem and pinpointed treatment, the baby is alive and well today. For 48% of the cases the center works on today, supercomputer-powered genetic diagnosis points the way toward a more effective treatment.
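The diagnostic step described here, narrowing billions of sequenced bases down to a couple of candidate variants, boils down to aggressive filtering. As a toy sketch of the idea, the code below keeps only variants that are rare in healthy populations and fall in genes already linked to the patient’s symptoms. Actual pipelines such as the center’s are far more sophisticated; all gene names, variant IDs, and frequencies here are illustrative.

```python
# Toy sketch of diagnostic variant filtering: keep only rare variants that
# fall in genes already linked to the patient's symptoms. Illustrative only;
# real clinical pipelines are far more sophisticated.
def filter_candidates(patient_variants, symptom_genes, population_freq, max_freq=0.001):
    """patient_variants: list of (gene, variant_id) tuples.
    symptom_genes: genes implicated by the patient's clinical features.
    population_freq: variant_id -> allele frequency in healthy populations."""
    return [
        (gene, vid)
        for gene, vid in patient_variants
        if gene in symptom_genes                        # phenotype match
        and population_freq.get(vid, 0.0) <= max_freq   # rare in healthy people
    ]

# Hypothetical data: a gene panel tied to infant liver failure
variants = [("ABCB11", "rs_x1"), ("BRCA2", "rs_x2"), ("ABCB11", "rs_x3")]
panel = {"ABCB11", "JAG1"}
freqs = {"rs_x1": 0.0002, "rs_x2": 0.0001, "rs_x3": 0.15}
print(filter_candidates(variants, panel, freqs))        # -> [('ABCB11', 'rs_x1')]
```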

The University of Toronto’s SickKids Centre for Computational Medicine uses a supercomputer operating at 107 trillion calculations per second to predict the minute differences between individual children in order to identify the best treatment for each child under their care.

Researchers at the University of Oslo (Norway) are using a supercomputer to help identify the genes that cause bowel and prostate cancer, two common forms of the disease. There are 4,000 new cases of bowel cancer in Norway every year. Only 6 out of 10 patients survive the first five years. Prostate cancer affects 5,000 Norwegians every year and 9 out of 10 patients survive. The researchers are employing the supercomputer to compare the genetic makeup of healthy cells and cancer cells, paying special attention to complex genes called fusion genes.
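At its core, tumor/normal comparison of this kind asks which variants appear in the cancer cells but not in the patient’s healthy cells, with fusion genes flagged when a breakpoint joins two different genes. The following is a deliberately simplified sketch of both ideas, with invented data; real fusion detection works from raw read alignments rather than tidy sets.

```python
# Minimal sketch of tumor/normal comparison: somatic variants are those seen
# in the tumor sample but not in the patient's healthy cells. Fusion-gene
# candidates here are just breakpoints joining two different genes, which is
# a drastic simplification of real fusion detection.
def somatic_variants(tumor, normal):
    """tumor, normal: sets of (gene, variant_id) calls from each sample."""
    return tumor - normal

def fusion_candidates(breakpoints):
    """breakpoints: list of (gene_a, gene_b) pairs from split-read evidence."""
    return [(a, b) for a, b in breakpoints if a != b]

tumor  = {("KRAS", "v1"), ("TMPRSS2", "v2"), ("APC", "v3")}
normal = {("APC", "v3")}                       # inherited, not tumor-specific
print(somatic_variants(tumor, normal))         # tumor-only calls
print(fusion_candidates([("TMPRSS2", "ERG"), ("KRAS", "KRAS")]))
```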

The Frédéric Joliot Hospital Department (Orsay, France) is using the powerful supercomputer at the French Alternative Energies and Atomic Energy Commission (CEA) in Bruyères-le-Châtel to improve understanding of how tracers used in PET scans for cancer diagnosis distribute themselves through the body. The goals of this research are to optimize PET scan data analysis and, later on, to personalize the PET scan process for each patient in order to produce better outcomes.

Doctors at Australia’s Victor Chang Cardiac Research Institute are using gaming technology (graphics processors) on a supercomputer to identify how individuals’ genetic makeups can affect the severity of their heart rhythm diseases. The researchers built a virtual heart, then applied the recorded heartbeats of patients to the digital heart model in order to spot abnormal electrocardiogram signals. The whole process took 10 days using HPC, instead of the 21 years it would have taken with a contemporary personal computer. In other words, this important work would be impractical without the supercomputer.
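One way to picture the final flagging step: compare each recorded beat against a simulated healthy template and mark low-similarity beats as abnormal. The sketch below uses a simple normalized correlation for this. The institute’s actual approach rests on full cardiac simulation and is not described at this level of detail in the interview, so treat this purely as an assumed illustration.

```python
# Toy sketch of one flagging step: compare a patient's recorded ECG beats to
# a simulated template and flag low-similarity beats as abnormal.
import math

def normalized_correlation(a, b):
    """Pearson-style similarity between two equal-length signal windows."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    var = math.sqrt(sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b))
    return cov / var if var else 0.0

def flag_abnormal(recorded_beats, simulated_template, threshold=0.9):
    """Return indices of beats that correlate poorly with the template."""
    return [i for i, beat in enumerate(recorded_beats)
            if normalized_correlation(beat, simulated_template) < threshold]

template = [0.0, 0.1, 1.0, 0.2, 0.0]            # simulated "healthy" beat
beats = [[0.0, 0.1, 1.0, 0.2, 0.0],             # matches template
         [0.0, 0.9, 0.1, 0.8, 0.0]]             # irregular shape
print(flag_abnormal(beats, template))           # -> [1]
```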

HPCwire: To a large degree, mechanistic modeling and simulation – beyond compound structure analysis and docking scoring – hasn’t played a large role in the clinic or basic research. Do you think this will change and what will drive the change?

Anton 1 supercomputer specialized for life sciences modeling and simulation

Conway: Modeling and simulation will continue to play a key role in designing a wide array of medical technology products used in clinical practice, from heart pacemakers to diagnostic imaging tools such as MRI and PET scanners. M&S is also crucial for genome sequencing and precision dosing of pharmaceuticals, both of which are important for precision medicine. I think M&S and advanced analytics will go hand-in-hand in this emerging market.

HPCwire: What haven’t I asked that I should?

Conway: Just that precision medicine will be the next market segment IDC adds to the ones we track in our high performance data analysis, or HPDA, practice. Precision medicine will join fraud and anomaly detection, affinity marketing, and business intelligence as segments made up mainly of large commercial firms that have adopted HPC for the first time. We forecast that the whole HPDA server and storage market will exceed $5 billion in 2020. Of that amount, about $3.5 billion will come from existing HPC sites and about $1.6 billion will be added to the HPC market by new commercial buyers. Assuming that precision medicine fulfills its promise over the next decade, it is likely to become the single largest market for HPDA, that is, data-intensive computing using HPC resources.


Steve Conway is research vice president in IDC’s High Performance Computing group, where he plays a major role in directing and implementing HPC research related to the worldwide market for technical servers and supercomputers. He is a 25-year veteran of the HPC and IT industries. Before joining IDC, Conway was vice president of corporate communications and investor relations for Cray, and before that had stints at SGI and CompuServe Corporation.
