Collaborative Efforts Produce Clinical Workflows for Fast Genetic Analysis

May 6, 2019

With individualized medicine—one of the holy grails of modern healthcare—diagnosis and treatment of patients would rely in part on each individual’s specific DNA profile, enabling truly personalized care. But in order for genetic information to contribute meaningfully to patient care, DNA testing has to be affordable and efficient. In 2017, the Mayo Clinic Center for Individualized Medicine (CIM) and the University of Illinois at Urbana-Champaign embarked on a two-year Grand Challenge under the auspices of the Mayo Clinic & Illinois Alliance for Technology-Based Healthcare with the goal of making DNA analysis a possibility for every patient. The first aim of the project focused on finding faster methods for clinical analysis of the whole human genome.

The Grand Challenge project, led by Eric Wieben, Ph.D. at Mayo Clinic and Matthew Hudson, Ph.D. at Illinois, tasked Liudmila Mainzer, Ph.D., Technical Program Manager of the Genomics group at Illinois’ National Center for Supercomputing Applications (NCSA), with speeding up clinical testing. Her group conducted analyses to find the fastest tools for genetic variant calling, which analyzes how a specific DNA sample differs from a standard reference. Ultimately, Mayo Clinic decided to adopt a new variant calling software that completes analysis 44 times faster than the traditional industry-standard pipeline—requiring just a few hours to process a whole genome, rather than days. But while faster software makes a significant difference, the bulk of the project lay in the next step for Mainzer’s team: wrapping the newly adopted software tools into a modularized clinical workflow. The resulting “Mayomics” (Mayo + genomics) variant calling workflow will be easy to maintain, update, customize, and run across Mayo Clinic’s many labs and numerous specialized procedures.
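The core idea behind variant calling—reporting where a sample’s DNA differs from a standard reference—can be illustrated with a toy sketch. Real clinical pipelines like the one described here operate on millions of aligned sequencing reads and use statistical models to distinguish true variants from sequencing error; this simplified, hypothetical example only shows the concept of a position-by-position comparison.

```python
def call_variants(reference: str, sample: str):
    """Return a list of (position, reference_base, sample_base) tuples
    at every position where the sample differs from the reference.

    A deliberately naive illustration: real variant callers work on
    aligned reads, handle insertions/deletions, and assign quality
    scores to each call.
    """
    variants = []
    for pos, (ref_base, alt_base) in enumerate(zip(reference, sample)):
        if ref_base != alt_base:
            variants.append((pos, ref_base, alt_base))
    return variants

reference = "GATTACAGATTACA"
sample    = "GATTGCAGATCACA"
print(call_variants(reference, sample))
# → [(4, 'A', 'G'), (10, 'T', 'C')]
```

In practice, the speedups described in the article come from faster implementations of this kind of analysis at genome scale, where a single sample spans roughly three billion positions.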

Nate Mattson, an IT lead analyst for the Department of IT Executive Administration at Mayo Clinic, coordinated with NCSA and with twelve clinical labs at Mayo Clinic to make sure the finished workflow would meet the needs of hundreds of clinical staff members. Mattson notes that the ability to configure and scale the workflow across multiple procedures and inputs, including whole genome data, is a critical design element—as is automation, which “enables 24/7 processing…without any human intervention” once samples have been sequenced. Modularity—separating tasks into self-contained scripts that can be mixed and matched as needed—is essential on multiple levels, Mainzer adds. “With so many different assays for so many different diseases and conditions, it would be impractical to write, test, and maintain individual workflows for each of them, and keep them in sync and up-to-date as the field evolves. Our design specifically addresses this through modules that can be used in hundreds of workflows, but only need to be updated once when changes occur.”
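The modular design Mainzer describes—self-contained task modules that many workflows share, so a change propagates everywhere from a single update—can be sketched in miniature. The module names below are illustrative placeholders, not actual Mayomics components.

```python
# A minimal sketch of modular workflow composition: each task registers
# itself as a named module, and a workflow is just an ordered list of
# module names. Updating one module updates every workflow that uses it.

MODULES = {}

def module(func):
    """Register a function as a reusable workflow module."""
    MODULES[func.__name__] = func
    return func

@module
def align(data):
    return data + ["aligned"]

@module
def deduplicate(data):
    return data + ["deduplicated"]

@module
def call_variants(data):
    return data + ["variants_called"]

def run_workflow(steps, sample):
    """Run the named modules in order on a sample identifier."""
    data = [sample]
    for step in steps:
        data = MODULES[step](data)
    return data

# Two hypothetical assays share modules but compose them differently:
germline = run_workflow(["align", "deduplicate", "call_variants"], "sampleA")
rapid    = run_workflow(["align", "call_variants"], "sampleB")
print(germline)  # ['sampleA', 'aligned', 'deduplicated', 'variants_called']
```

This mirrors the maintenance benefit Mainzer cites: hundreds of assay-specific workflows can reuse the same tested modules rather than duplicating logic.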

While the completed workflow satisfied the requirements set forth in the Grand Challenge, Mayo Clinic and Illinois decided to extend their collaborative project in order to add more functionality and configure new workflows, such as variant calling for tumor samples. In the meantime, the first Mayomics workflow has completed a process of rigorous testing by Mayo Clinic’s Software Quality Assurance (SQA) team and is now undergoing a “verification” phase in Mayo Clinic labs prior to official clinical deployment. Clinical work requires robust code and quality control, notes Mainzer, and has to meet exacting external specifications. According to Mattson, “Mayomics will support and exceed all of the auditing requirements set forth by CAP/CLIA and NYS/CLEP,” two sets of national standards for laboratory work.

Mattson and a team comprising Mayo Clinic research IT, clinical IT, and SQA staff work closely with NCSA Genomics to define and clarify requirements for each new request, confirm implementation details, and troubleshoot potential snags. The Mayo Clinic team also develops pieces of the workflow that are highly specific to Mayo Clinic internal systems and procedures. The collaboration works well: “There is value in clinical teams focusing on their own clinical process and medical informatics [while] someone else worries about code organization, workflow development and functionality,” says Mainzer. “These are two different mindsets and it helps when different heads are busy with each one.” Mattson is happy that the Mayomics workflows will enable more efficient, cost-effective analysis. “A diagnosis can be life-changing,” he notes. “Anything we can do to expedite that process without compromising quality is critical.”

This work was a product of the Mayo Clinic & Illinois Alliance for Technology-Based Healthcare. Major funding was provided by the Mayo Clinic Center for Individualized Medicine and the Todd and Karen Wanek Program for Hypoplastic Left Heart Syndrome. The Interdisciplinary Health Sciences Institute, Carl R. Woese Institute for Genomic Biology, and the National Center for Supercomputing Applications also provided support and resources.

ABOUT NCSA

The National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign provides supercomputing and advanced digital resources for the nation’s science enterprise. At NCSA, University of Illinois faculty, staff, students, and collaborators from around the globe use advanced digital resources to address research grand challenges for the benefit of science and society. NCSA has been advancing one third of the Fortune 50 for more than 30 years by bringing industry, researchers, and students together to solve grand challenges at rapid speed and scale.


Source: NCSA
