Many patients struggle to recover from traumatic brain injuries, with brain atrophy magnifying the impact of the initial injury – and few, if any, effective treatments are available to prevent this additional damage. Stem cell therapy, however, has shown promise in a number of studies, and researchers are now using supercomputers to help investigate this novel treatment.
The research team, based at the McGovern Medical School at the University of Texas Health Science Center at Houston (UTHealth), is working on the hypothesis that the degradation is caused by excessive inflammation – inflammation that can be countered with stem cells from bone marrow. “We know that the more brain tissue you lose, the worse you do in terms of neurocognitive outcomes,” said Charles S. Cox, Jr., director of the pediatric program at the McGovern Medical School, in an interview with Aaron Dubrow of the Texas Advanced Computing Center (TACC). “The idea is to interrupt that process to some degree, so that we have preservation of brain structures.”
Cox and his colleagues completed a first phase of trials to assess safety with pediatric and adult patients in 2014; a second phase began in 2013 (for pediatric patients) and 2016 (for adult cases) to assess whether the treatment successfully preserves brain tissue and function – and if so, how. To track these factors, the researchers turned to advanced brain imaging. Key among these techniques was diffusion MRI, which measures the diffusion of water molecules within tissue to produce precise, quantitative measurements.
“With diffusion MRI, you actually get numbers and the numbers are quantitative,” said Jenifer Juranek, an associate professor of pediatric surgery at UTHealth. “They tell you whether diffusion is restricted, or whether diffusion happens more readily. In TBI in particular, the post-injury diffusion tends to be higher than it should be. That’s most likely because things are breaking down and you don’t have anything hindering that diffusion anymore.”
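The quantitative numbers Juranek describes are typically summary statistics derived from a diffusion tensor fitted at each voxel. As an illustrative sketch (the article does not specify which metrics the team computes), two standard measures are mean diffusivity, which rises when tissue breaks down and stops hindering diffusion, and fractional anisotropy, which drops when diffusion is no longer restricted to a preferred direction:

```python
import numpy as np

def diffusion_metrics(eigenvalues):
    """Mean diffusivity (MD) and fractional anisotropy (FA) from the
    three eigenvalues of a voxel's diffusion tensor (units: mm^2/s)."""
    l1, l2, l3 = eigenvalues
    md = (l1 + l2 + l3) / 3.0
    # FA: 0 = diffusion equal in all directions (isotropic),
    # approaching 1 = diffusion strongly restricted to one direction
    num = np.sqrt((l1 - md) ** 2 + (l2 - md) ** 2 + (l3 - md) ** 2)
    den = np.sqrt(l1 ** 2 + l2 ** 2 + l3 ** 2)
    fa = np.sqrt(1.5) * num / den if den > 0 else 0.0
    return md, fa

# Illustrative values only: healthy white matter diffuses readily along
# axons but is hindered across them, so FA is high.
md, fa = diffusion_metrics((1.7e-3, 0.3e-3, 0.3e-3))

# After injury, barriers break down, diffusion becomes freer and more
# uniform in direction: MD rises, FA falls.
md_injured, fa_injured = diffusion_metrics((1.2e-3, 1.1e-3, 1.0e-3))
```

In this toy comparison the "injured" voxel shows higher mean diffusivity and much lower anisotropy, matching the post-injury pattern Juranek describes, where diffusion "tends to be higher than it should be."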
Diffusion MRI’s precision, however, comes with enormous quantities of data that are difficult to manage on traditional systems. “Each one of the more than 200 datasets we collect off the MRI scanner is about a gigabyte — that’s just raw data,” Juranek said. “Once we put the data in pipelines to look at macrostructure, volume, surface area, and diffusion, it gets huge.”
The researchers turned to TACC to help bridge this computing gap. Juranek and her colleagues have been using the Lonestar5 system for volumetric analysis of white matter and ventricles in the MRIs, and the more GPU-heavy Maverick2 system for diffusion analysis. Juranek also worked with TACC staff, including Joe Allen, a research associate who helped her adapt her analyses to the supercomputers.
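The core idea behind the volumetric analysis run on Lonestar5 can be sketched simply, though the team's actual pipeline is not described in the article: given a labeled segmentation of an MRI (where each voxel is tagged as white matter, ventricle, and so on), a structure's volume is its voxel count times the physical volume of one voxel. The label values and voxel dimensions below are hypothetical:

```python
import numpy as np

def structure_volume(label_map, label, voxel_dims_mm):
    """Volume in mm^3 of one labeled structure in a 3D segmentation map."""
    n_voxels = int(np.count_nonzero(label_map == label))
    voxel_volume_mm3 = float(np.prod(voxel_dims_mm))
    return n_voxels * voxel_volume_mm3

# Toy 4x4x4 segmentation: label 1 = "white matter", label 2 = "ventricle"
seg = np.zeros((4, 4, 4), dtype=np.uint8)
seg[1:3, 1:3, 1:3] = 1   # a 2x2x2 block: 8 voxels of white matter
seg[0, 0, 0] = 2         # 1 voxel of ventricle

# With 1 mm isotropic voxels, the white-matter volume is 8 mm^3
wm_volume = structure_volume(seg, 1, (1.0, 1.0, 1.0))
```

Tracking such volumes across scans taken before and after treatment is how tissue preservation, the outcome the trial is designed to measure, can be quantified.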
“I honestly could not have performed this work without Joe’s expertise and all of TACC’s resources,” she said. “I think it’s important for other researchers intimidated by HPC to know that we don’t have to be experts in HPC or MPI or multithreaded tasks. We just need to be willing to listen to the advice we’re given by TACC folks to develop and maintain exceptional workflows on TACC resources.”
For the researchers, these resources proved critical, opening new doors for their research.
“If medical research wants to continue to make advances, they need to pair themselves with high performance clusters,” Juranek said. “The information that we’re gathering is so massive that in order to analyze it properly, you’ve got to have access to these kinds of resources.”
Header image: The corpus callosum reconstructed from diffusion MRI data. Image courtesy of Jenifer Juranek.
The original reporting is by Aaron Dubrow of TACC.