IDC’s Conway Sets Stage for SC16 Precision Medicine Panel

By John Russell

November 4, 2016

Editor’s Note: Kicking off SC this year is what promises to be a fascinating panel – HPC Impacts on Precision Medicine: Life’s Future–The Next Frontier in Healthcare. In this pre-SC16 Q&A, Steve Conway, research vice president in IDC’s High Performance Computing group and moderator of the panel, sets the stage. HPC, of course, has been transforming life sciences and medicine for nearly two decades. The transformation began in research – sequencing the human genome was as much an HPC achievement as it was a triumph of new DNA sequencing instruments. HPC has since moved steadily, albeit slowly, into the clinic. There are even special-purpose supercomputers – Anton – dedicated to life sciences.

What’s emerged is this umbrella notion of precision medicine (PM): the idea that it should be possible to leverage all of our hard-won knowledge, new instrument technology, computational power, and a growing wealth of data spanning individuals to populations to deliver more effective therapies, preventative measures, and even lifestyle-enhancing choices. That’s a mouthful. Genomics is the most prominently touted enabler, but there are many pieces to the PM puzzle.

We are still at the edge of this sea. Only in recent years, for example, have we come to appreciate the influence the microbiome in each of us has on health. Yet while so much remains to be discovered, much has been accomplished. Precision medicine is already starting to transform healthcare; in certain cancers, for example, it has proven decisive. Expectations are high. Think of the many recent and ongoing initiatives such as the BRAIN Initiative, the Cancer Moonshot, the 1000 Genomes Project, the Blue Brain Project, and the Human Microbiome Project, to name just a few, all of which depend upon advanced computing. It seems fitting for SC to showcase the important role HPC plays in life sciences research and medicine. – John Russell

HPCwire: The fact that Precision Medicine is the opening panel at SC strongly suggests the growing importance of HPC in making PM and basic life science research possible. Recognizing SC is primarily a technology conference, could you frame the goals of this panel?

Steve Conway: Precision medicine, also called personalized medicine, promises to transform medical practice and healthcare spending by enabling diagnoses and treatment plans that are custom-tuned for each patient’s physiology, symptoms, medical history, DNA, and even lifestyle. What constitutes a good outcome for a broken hand may be different for an office worker and a concert violinist. HPC is already playing a key role in early precision medicine initiatives around the world, by speeding up genome sequencing and by making it possible to quickly sift through millions of archived patient records to identify the treatments that have had the best success rates for patients closely resembling the patient under investigation. Biology is fast becoming a digital science, and healthcare analytics is one of the fastest-growing new market segments for HPC. Precision medicine is happening at the intersection of biology, medical practice, healthcare economics, and data science. The expert panel at SC’16 will explore this emerging domain from these varied perspectives, with special emphasis on the major role HPC has already started to play.

Warren Kibbe, NCI

This is a pretty august group:

  • Mitchell Cohen, Director of Surgery, Denver Health Medical Center; Professor, University of Colorado School of Medicine.
  • Martha Head, Senior Director, The Noldor; Acting Head, Insights from Data at GlaxoSmithKline Pharmaceuticals.
  • Warren Kibbe, Director, Center for Biomedical Informatics and Information Technology (CBIIT); Chief Information Officer; Acting Deputy Director, National Cancer Institute (NCI).
  • Dimitri Kusnezov, Chief Scientist & Senior Advisor to the Secretary, U.S. Department of Energy, National Nuclear Security Administration.
  • Steve Scott, Chief Technology Officer, Cray Inc.

HPCwire: Today, much of what constitutes PM is big data analytics. Within this context: a) what are the key technologies (compute/architectures, storage, informatics, etc.) being used, b) what are the big technology challenges/bottlenecks, and c) where do you expect near-term progress?

Conway: We’ll hear more about this from the experts on the panel, but in general the computer technologies being used today to support precision medicine vary from purpose-built supercomputers such as IBM Watson with its advanced natural language capability to Linux clusters with the usual processors and software. One big challenge is getting access to detailed data on large enough patient populations—some big healthcare companies are investing a lot of money today to acquire more data. Another challenge is speed. An important decision-support goal over time is for the computer to spit out efficacy curves for treatment options in near-real time, while the patient is still sitting across from the doctor. Yet another challenge is the state of the data science—there’s a big need for tools that help users understand the data better, including benchmarks to verify that the results are useful.
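
To make that decision-support idea concrete, here is a minimal Python sketch of the cohort analytics Conway describes: find the archived patients most similar to the one under investigation, then rank treatments by how well they worked for that cohort. The feature encoding, the synthetic records, and the plain nearest-neighbor similarity measure are all hypothetical stand-ins for illustration, not any vendor’s actual system.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical archive: one row per past patient. Features are normalized
    # measurements (age, labs, tumor markers, ...); each record also notes
    # which of three candidate therapies was given and whether the outcome
    # was good (1) or poor (0).
    archive_features = rng.random((10_000, 8))
    archive_treatment = rng.integers(0, 3, size=10_000)
    archive_outcome = rng.integers(0, 2, size=10_000)

    def rank_treatments(patient, k=200):
        """Rank therapies by observed success rate among the k archived
        patients most similar to this one (plain Euclidean distance)."""
        dist = np.linalg.norm(archive_features - patient, axis=1)
        cohort = np.argsort(dist)[:k]  # indices of the k closest records
        scores = {}
        for t in np.unique(archive_treatment[cohort]):
            mask = archive_treatment[cohort] == t
            scores[int(t)] = float(archive_outcome[cohort][mask].mean())
        return sorted(scores.items(), key=lambda kv: -kv[1])

    new_patient = rng.random(8)
    for therapy, success in rank_treatments(new_patient):
        print(f"therapy {therapy}: {success:.0%} success rate in similar cohort")

At production scale the same pattern runs against millions of richly encoded records with far more sophisticated similarity measures, which is where the near-real-time goal Conway mentions pulls in HPC-class hardware.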

HPCwire: How significant is the relative lack of HPC expertise and general computational literacy of most clinical physicians and even life scientists generally? The command line is hardly a friendly place for them. What, if anything, should be done to support them and to raise their computational skill level?

Conway: One of the biggest barriers across all of HPC is the C. P. Snow “two cultures” problem, where in the case of HPC you have computer scientists and domain scientists trying to communicate with each other using different languages. In precision medicine you might have HPC vendors talking about integer or floating point operations per second, while the buyers and users want to hear about cancer detections per second. My own opinion is that in precision medicine, to be successful HPC vendors will need to bend more toward the users than the other way around. I don’t think vendors can expect users to make a big effort to become more proficient in HPC. It will be interesting to hear what the panelists at SC’16 have to say about this.

HPCwire: How should we expect delivery of PM technology to evolve? IBM Watson has received a lot of attention using a cloud-like model, while many institutions have on-premise resources. How will the PM delivery ecosystem (HPC infrastructure) evolve?

Conway: Again, you’ll get a fuller discussion of this during the SC panel session, but it seems clear that an effective precision medicine environment will involve both on-premise and cloud resources, presumably integrated in a way that’s transparent to users. You’ll need on-premise resources for brute-force computing and cloud resources for tasks including data research, records transfer, and general communication. Most healthcare systems already rely on private clouds for communication among providers and between providers and patients. The brute-force computing will be needed for near-real-time diagnosis and treatment planning.

HPCwire: What are two or three examples of the most advanced HPC-based PM systems in use today, and what makes them distinct?

Conway: Let’s start with IBM Watson. In 2011, Watson stunned a huge American television audience by defeating two past human champions of the Jeopardy! game show. The great achievement of this digital brain was its ability to “understand” natural language — specifically, natural language expressed in the interrogatory syntax of the game show. On the heels of this triumph, IBM announced in January 2014 that it would invest $1 billion to advance Watson’s decision-making abilities for major commercial markets, including healthcare. Not much later, in May 2015, IBM said 14 U.S. cancer treatment centers had signed on to receive personalized treatment plans selected by a Watson supercomputer. Watson has contracted since its Jeopardy! days “from the size of a master bedroom to three stacked pizza boxes.” Watson will parse the DNA of each patient’s cancer and recommend what it considers the optimal medical treatment, so it’s a powerful decision-support tool for healthcare providers.

The Center for Pediatric Genomic Medicine at Children’s Mercy Hospital, Kansas City, Missouri, has been using supercomputer power to help save the lives of critically ill children. In 2010, the center’s work was named one of Time magazine’s top 10 medical breakthroughs. Roughly 4,100 genetic diseases affect humans, and these are the main causes of infant deaths. But identifying which genetic disease is affecting a critically ill child isn’t easy. For one infant suffering from liver failure, the center used 25 hours of supercomputer time to analyze 120 billion nucleotide sequences and narrowed the problem down to two genetic variants. This allowed the doctors to begin treatment with corticosteroids and immunoglobulin. Thanks to this highly accurate diagnosis of the problem and pinpointed treatment, the baby is alive and well today. For 48% of the cases the center works on today, supercomputer-powered genetic diagnosis points the way toward a more effective treatment.
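
The computational heart of such a diagnosis is a massive filtering problem: collapse billions of sequenced reads into candidate variants, then keep only the rare ones that touch genes plausibly linked to the symptoms. The toy Python filter below illustrates only that final narrowing step; the gene panel, frequency threshold, and tab-separated input format are all invented for illustration and are not Children’s Mercy’s actual pipeline.

    # Keep variants that are (a) in a gene on the disease panel and (b) rare
    # in the general population. Both the panel and threshold are hypothetical.
    DISEASE_PANEL = {"ABCB11", "ATP8B1", "JAG1"}  # genes linked to liver disease
    MAX_POPULATION_FREQ = 0.001                   # discard common variants

    def candidate_variants(lines):
        """Yield (gene, variant_id) for rare variants in panel genes.
        Each line: gene <tab> variant_id <tab> population_frequency."""
        for line in lines:
            gene, variant_id, freq = line.rstrip("\n").split("\t")
            if gene in DISEASE_PANEL and float(freq) <= MAX_POPULATION_FREQ:
                yield gene, variant_id

    calls = [
        "ABCB11\trs0001\t0.0004",  # rare and on the panel -> kept
        "TP53\trs0002\t0.0001",    # rare, but not a panel gene -> dropped
        "JAG1\trs0003\t0.2500",    # panel gene, but common -> dropped
    ]
    print(list(candidate_variants(calls)))  # [('ABCB11', 'rs0001')]

In the real case the supercomputer’s 25 hours go into the upstream alignment and variant calling across 120 billion nucleotide sequences; the panel-style triage sketched here is what turns that flood into a short list of two variants a clinician can act on.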

The University of Toronto’s SickKids Centre for Computational Medicine uses a supercomputer operating at 107 trillion calculations per second to predict the minute differences between individual children in order to identify the best treatment for each child under its care.

Researchers at the University of Oslo (Norway) are using a supercomputer to help identify the genes that cause bowel and prostate cancer, two common forms of the disease. There are 4,000 new cases of bowel cancer in Norway every year. Only 6 out of 10 patients survive the first five years. Prostate cancer affects 5,000 Norwegians every year and 9 out of 10 patients survive. The researchers are employing the supercomputer to compare the genetic makeup of healthy cells and cancer cells, paying special attention to complex genes called fusion genes.

The Frédéric Joliot Hospital Department (Orsay, France) is using the powerful supercomputer at the French Alternative Energies and Atomic Energy Commission (CEA) in Bruyères-le-Châtel to improve understanding of how tracers used in PET scans for cancer diagnosis distribute themselves through the body. The goals of this research are to optimize PET scan data analysis and, later on, to personalize the PET scan process for each patient in order to produce better outcomes.

Doctors at Australia’s Victor Chang Cardiac Research Institute are using supercomputer-based gaming technology to identify how individuals’ genetic makeups can affect the severity of their heart rhythm diseases. The researchers built a virtual heart, then applied the recorded heartbeats of patients to the digital heart model in order to spot abnormal electrocardiogram signals. The whole process took 10 days using HPC, instead of the 21 years it would have taken with a contemporary personal computer. In other words, this important work would be impractical without the supercomputer.
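
Taken at face value, those figures imply a speedup on the order of 21 years × 365 days/year ≈ 7,665 days, and 7,665 ÷ 10 days ≈ 770×, which is the difference between a tractable study and an impossible one.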

HPCwire: To a large degree, mechanistic modeling and simulation – beyond compound structure analysis and docking scoring – hasn’t played a large role in the clinic or in basic research. Do you think this will change, and what will drive the change?

Anton 1 supercomputer specialized for life sciences modeling and simulation

Conway: Modeling and simulation will continue to play a key role in designing a wide array of medical technology products used in clinical practice, from heart pacemakers to diagnostic imaging tools such as MRI and PET scanners. M&S is also crucial for genome sequencing and precision dosing of pharmaceuticals, both of which are important for precision medicine. I think M&S and advanced analytics will go hand-in-hand in this emerging market.

HPCwire: What haven’t I asked that I should?

Conway: Just that precision medicine will be the next market segment IDC adds to the ones we track in our high performance data analysis, or HPDA, practice. Precision medicine will join fraud and anomaly detection, affinity marketing, and business intelligence as new segments that are made up mainly of large commercial firms adopting HPC for the first time. We forecast that the whole HPDA server and storage market will exceed $5 billion in 2020. Of that amount, about $3.5 billion will come from existing HPC sites and about $1.6 billion will be added to the HPC market by new commercial buyers. Assuming that precision medicine fulfills its promise over the next decade, it is likely to become the single largest market for HPDA, that is, data-intensive computing using HPC resources.

 

Steve Conway is research vice president in IDC’s High Performance Computing group, where he plays a major role in directing and implementing HPC research related to the worldwide market for technical servers and supercomputers. He is a 25-year veteran of the HPC and IT industries. Before joining IDC, Conway was vice president of corporate communications and investor relations for Cray, and before that had stints at SGI and CompuServe Corporation.
