Stampede Simulations Show Better Way to Predict Blood Clots

January 30, 2017

The heart is a wonder of design – a pump that can function for 80 years and billions of heartbeats without breaking down. But when it does malfunction, the results can be dire.

In research reported in the International Journal of Cardiology this month, scientists from Johns Hopkins University and Ohio State University presented a new method for predicting those most at risk for thrombus, or blood clots, in the heart.

A hemodynamic profile of a patient with a history of left ventricular thrombus (blood clotting) derived from computational fluid dynamic modeling. Credit: Rajat Mittal, Jung Hee Seo and Thura Harfi

The critical factor, the researchers found, is the degree to which the mitral jet – a stream of blood shot through the mitral valve – penetrates into the left ventricle of the heart. If the jet doesn’t travel deep enough into the ventricle, it can prevent the heart from properly flushing blood from the chamber, potentially leading to clots, strokes and other dangerous consequences.

The findings were based on simulations performed using the Stampede supercomputer at the Texas Advanced Computing Center and validated using data from patients who both did and did not experience post-heart attack blood clots. The work was supported by a grant from the National Science Foundation.

The metric that characterizes the jet penetration, which the researchers dub the E-wave propagation index (EPI), can be ascertained using the standard diagnostic tools and clinical procedures already used to assess a patient's risk of clot formation, yet it is much more accurate than current methods.

“The beauty of the index is that it doesn’t require any additional measurements. It simply reformulates echocardiogram data into a new metric,” said Rajat Mittal, a computational fluid dynamics expert and professor of mechanical engineering at Johns Hopkins University and one of the principal investigators on the research. “The clinician doesn’t have to do any additional work.”

Heart disease is the leading cause of death in the U.S. and by far the most expensive disease in terms of health care costs. Some of these deaths are caused by heart attacks; others result from blood clots, which frequently form in a heart weakened by disease or a traumatic injury.

Clots can occur whenever blood remains stagnant. Since the chambers of the heart are the largest reservoirs of blood in the body, they are the areas most at risk for generating clots.

Predicting when a patient is in danger of developing a blood clot is challenging for physicians. Patients recovering from a heart attack are frequently given anticoagulant drugs to prevent clotting, but these drugs have adverse side-effects.

Cardiologists currently use the ejection fraction – the percentage of blood flushed from the heart with each beat – as well as a few other factors, to predict which patients are at risk of a future clot.

For healthy individuals, 55 to 70 percent of the volume of the chamber is ejected out of the left ventricle with every heartbeat. For those with heart conditions, the ejection fraction can be reduced to as low as 15 percent and the risk of stagnation rises dramatically.

Though an important factor, the ejection fraction does not appear to be an accurate predictor of future clotting risk.
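To make the ejection-fraction numbers above concrete, the percentage is simply the fraction of the chamber's end-diastolic volume expelled with each beat. The sketch below uses hypothetical volumes (the article does not report specific patient measurements):

```python
def ejection_fraction(edv_ml: float, esv_ml: float) -> float:
    """Ejection fraction (%) from end-diastolic (EDV) and end-systolic (ESV) volumes."""
    if edv_ml <= 0 or esv_ml < 0 or esv_ml > edv_ml:
        raise ValueError("volumes must satisfy 0 <= ESV <= EDV, with EDV > 0")
    return 100.0 * (edv_ml - esv_ml) / edv_ml

# Hypothetical values: a healthy ventricle vs. a severely weakened one.
print(ejection_fraction(120.0, 50.0))   # ~58% -- within the normal 55-70% range
print(ejection_fraction(120.0, 102.0))  # 15% -- the low end cited for diseased hearts
```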

Computational fluid dynamics results show that the mitral jet propagates toward the apex mainly during the E-wave, the early phase of ventricular filling. A mitral jet that propagates farther toward the apex during the E-wave produces significant apical washout. Thus, the propagation distance of the mitral jet into the left ventricle by the end of the E-wave, indexed by the length of the left ventricle, should correlate well with apical washout, and therefore with left ventricular thrombus risk.
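The ratio described above can be sketched as a simple normalization: jet penetration distance at the end of the E-wave divided by left-ventricle length. This is an illustrative reading of the article's description, not the clinical definition from the paper, and the numbers below are hypothetical:

```python
def e_wave_propagation_index(jet_penetration_cm: float, lv_length_cm: float) -> float:
    """EPI as described in the article: mitral-jet penetration distance by the
    end of the E-wave, indexed (normalized) by left-ventricle length.
    A dimensionless ratio near 1 means the jet reaches the apex."""
    if lv_length_cm <= 0:
        raise ValueError("left-ventricle length must be positive")
    return jet_penetration_cm / lv_length_cm

# Hypothetical measurements: a jet reaching 7.5 cm into a 9 cm ventricle
# washes out the apex far better than one stalling at 3 cm.
print(e_wave_propagation_index(7.5, 9.0))  # ~0.83 -- deep penetration, good washout
print(e_wave_propagation_index(3.0, 9.0))  # ~0.33 -- shallow jet, stasis risk
```

In practice the inputs would come from echocardiogram Doppler data, which is why, as Mittal notes below, the index requires no additional measurements.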

“Because we understood the fluid dynamics in the heart using our computational models, we reached the conclusion that the ejection fraction is not a very accurate measure of flow stasis in the left ventricle,” Mittal said. “We showed very clearly that the ejection fraction is not able to differentiate a large fraction of these patients and stratify risk, whereas this E-wave propagation index can very accurately stratify who will get a clot and who will not,” he said.

The results were the culmination of many years of investigation by Mittal and his collaborators into the fundamental relationship between the structure and function of the heart. To arrive at their hypothesis, the researchers captured detailed measurements from 13 patients and used those to construct high-fidelity, patient-specific models of the heart that take into account fluid flow, physical structures and bio-chemistry.

These models led, in turn, to new insights into the factors that correlate most closely to stagnation in the left ventricle, chief among them, mitral jet penetration.

Working in collaboration with clinicians, including lead author Thura Harfi of Ohio State University, the team tested their hypothesis using data from 75 individuals: 25 healthy patients, 25 patients who had experienced clots in their left ventricle, and 25 patients who had a compromised heart but no clots.

Based on the EPI measurement, the researchers found that one in every five patients with severe cardiomyopathy who is not currently being treated with anticoagulants would be at risk of a left ventricular clot and would benefit from anticoagulation, a finding that is pending validation in a larger cohort of patients.

“Physicians and engineers don’t interact as often as they should and that creates a knowledge gap that can be closed with this type of collaborative research,” Harfi said. “Computational fluid dynamics is such an established way of studying phenomena in mechanical engineering, but has rarely been tried in humans. But now, with the development of high-resolution cardiac imaging techniques like cardiac computed tomography (CT) and the availability of supercomputing power, we can apply the power of computational fluid dynamics simulations to study blood flow in human beings. The information you get from a computer simulation you cannot get otherwise.”

Mittal and his team required large computing resources to derive and test their hypothesis. Each simulation ran in parallel on 256 to 512 processors and took several hundred thousand computing hours to complete.

“This work cannot be done by simulating a single case. Having a large enough sample size to base conclusions on was essential for this research,” Mittal said. “We could never come close to being able to do what we needed to do if it weren’t for Stampede.”

Time on Stampede was provided through the Extreme Science and Engineering Discovery Environment (XSEDE).

Mittal foresees a time when doctors will routinely perform patient-specific heart simulations to determine the best course of treatment. However, hospitals would need systems hundreds of times faster than a current desktop computer to compute a solution locally in a reasonable timeframe.

In addition to establishing the new diagnostic tool for clinicians, Mittal’s research helps advance new, efficient computational models that will be necessary to make patient-specific diagnostics feasible.

The team plans to continue to test their hypothesis, applying the EPI metric to a larger dataset. They hope in the future to run a clinical study with prospective, rather than retrospective, analysis.

With a better understanding of the mechanics of blood clots and ways to predict them, the researchers have turned their attention to other sources of blood clots, including bio-prosthetic heart valves and atrial fibrillation (AFib) – a quivering or irregular heartbeat that affects 2.7 million Americans.

“These research results are an important first step to move our basic scientific understanding of the physics of how blood flows in the heart to real-time predictions and treatments for the well-being of patients,” said Ronald Joslin, NSF Fluid Dynamics program director.

“The potential for impact in this area is very motivating,” Mittal said, “not just for me but for my collaborators, students and post-docs as well.”


Source:  Aaron Dubrow, Texas Advanced Computing Center (TACC)
