by Karen Green, NCSA Senior Writer
Champaign, IL — Scientific research is often a game of strategy, with research teams constantly devising new ways to outmaneuver the challenges that inhibit their progress. Every new tool, every improved process is a chance to gain a competitive advantage. Algorithms that can better analyze field data, codes that promise to speed up the analysis of datasets, new visualization techniques — all are eagerly put to the test. Cancer researchers, for example, covet tools and techniques that can help them deal with large volumes of data from human subjects. The best of these tools become part of the best strategies and methods used to help medical science get a competitive edge on a formidable opponent: cancer.
The research team led by Kenneth Watkin at the University of Illinois at Urbana-Champaign is typical of cancer research teams. Watkin, a U of I professor of medicine and applied life studies, is one of two principal investigators on a project that aims to understand the content of ultrasonic images taken of tissue in cancer patients. His co-PI is Tanya Gallagher, dean of the U of I College of Applied Life Studies. Watkin realized he needed more computing power to analyze his group’s research data, so he began looking for solutions — a tool or a process that could cut the team’s computing time. When he turned to NCSA and learned that the center’s Origin2000 supercomputer could meet his data analysis challenges, he seized the opportunity.
“When we started our work, we were using an 800 MHz desktop computer, and it took at least several hours to process one ultrasonic image,” says Watkin. “We needed to process 400 to 600 images a year, and there was just no way we were going to accomplish that. We needed something faster, something that could process medical images at very high speeds.”
Help came in the form of Faisal Saied and Sirpa Saarinen in NCSA’s Performance Engineering group. They developed a parallel version of the team’s algorithm that analyzes textures in ultrasonic images and ported it to the Origin2000. The researchers received an allotment of time on the Origin2000 and now, a five- to six-hour computing and imaging process can be completed in about five minutes. The collaboration, says Watkin, means he is free to concentrate on the science of his research, knowing that the computational aspects are being handled.
Watkin’s team looks at tissue changes in the tongues of cancer patients, determining how much of the tissue is muscle and fat and how much is a stiff fibrous tissue normally not present in healthy individuals. Healthy tongue tissue is mostly muscle with some fatty tissue, says Watkin. However, in patients who receive radiation treatments for head or neck cancer, some of that muscle tissue often becomes fibrous and inflexible, a process known as fibrosis. The tissue changes can cause other problems in the patient such as dysphagia, or difficulty swallowing. In the most serious dysphagia cases, the patient is unable to swallow food and liquids often get rerouted to the windpipe, causing breathing problems. Sometimes swallowing becomes so difficult that the patient needs a feeding tube.
“When people are radiated, muscle tissue is heated up and begins to change,” explains Watkin. “Our study involves treatment strategies. Ultimately, we are looking at what is the best treatment we can provide while still maintaining muscle quality.”
The team, funded by the National Institutes of Health’s National Cancer Institute, uses ultrasound images of tongue tissue from patients at 10 cancer research centers nationwide. The data, which are transmitted electronically as image files to Watkin’s laboratory at Carle Foundation Hospital in Urbana, IL, include images from patients receiving different radiation doses. The images are ultrasounds taken at different times in patients’ treatments and up to a year after treatment has finished. By studying changes in tissue over time and comparing tissue changes in patients receiving different dosages of radiation, the researchers hope to develop treatment strategies for head and neck cancer patients that cause the least amount of damage to healthy tissue while still killing cancerous cells.
To analyze the ultrasound images and determine the textures of the tissues represented in each, the researchers developed a tissue analysis algorithm called the Gray Level Texture Parameter computation. To compute the textures, the code divides each image into kernels that are 8 x 8 pixels in size. A tool called the Spatial Gray Level Dependence (SGLD) matrix then computes a correlation number for each pixel within each 8 x 8 kernel of the image. This correlation number identifies whether the tissue is muscle, fat, or fibrous tissue. The SGLD correlation number for muscle ranges from 0 to 0.40, for fibrous tissue it ranges from 0.41 to 0.85, and for fat it is above 0.85. The end result is a color-coded image of the tongue tissue in which red represents muscle, yellow represents fat, and blue represents other tissue types, including fibrous tissue.
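The article does not reproduce the team's code, but the SGLD matrix described here is what image-processing texts call a gray-level co-occurrence matrix. The following is a minimal Python sketch of the idea, assuming a standard co-occurrence correlation definition and the thresholds quoted above; the number of gray levels (16) and the pixel offset used to form pairs are illustrative assumptions, not details from the article.

```python
import numpy as np

def sgld_correlation(kernel, levels=16, offset=(0, 1)):
    """Correlation of the spatial gray-level dependence (co-occurrence)
    matrix for one 8 x 8 kernel, built from pixel pairs separated by
    `offset`. Levels and offset are illustrative assumptions."""
    # Quantize 8-bit intensities into a small number of gray levels.
    q = np.clip(np.floor(kernel.astype(float) / 256.0 * levels).astype(int),
                0, levels - 1)
    dr, dc = offset
    rows, cols = q.shape
    glcm = np.zeros((levels, levels))
    for r in range(rows - dr):
        for c in range(cols - dc):
            glcm[q[r, c], q[r + dr, c + dc]] += 1  # count the pair
    total = glcm.sum()
    if total == 0:
        return 0.0
    p = glcm / total                    # joint probability of level pairs
    i = np.arange(levels)
    mu_i = (p.sum(axis=1) * i).sum()    # marginal means
    mu_j = (p.sum(axis=0) * i).sum()
    var_i = (p.sum(axis=1) * (i - mu_i) ** 2).sum()
    var_j = (p.sum(axis=0) * (i - mu_j) ** 2).sum()
    if var_i == 0 or var_j == 0:
        return 1.0                      # uniform kernel: perfectly correlated
    num = (np.outer(i - mu_i, i - mu_j) * p).sum()
    return num / np.sqrt(var_i * var_j)

def classify(corr):
    """Map a correlation number to a tissue type using the thresholds
    reported in the article (muscle up to 0.40, fibrous 0.41-0.85,
    fat above 0.85)."""
    if corr <= 0.40:
        return "muscle"
    if corr <= 0.85:
        return "fibrous"
    return "fat"
```

Applied kernel by kernel across an ultrasound image, the classification above yields exactly the kind of three-color map the article describes: one tissue label per kernel, rendered as red, yellow, or blue.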
“The correlation number that the computer generates for each pixel identifies what [tissue] type is at that particular point in the image,” says Ibrahima Diouf, a postdoctoral fellow in speech and hearing science and a member of the research team. “Generating a correlation matrix for each pixel is a computationally intensive process. It could be done on a PC, but it was taking us a whole day to get a full analysis of one image.”
When the team’s code was ported to the Origin2000, tissue analysis of the images made a giant leap in speed. NCSA’s Saarinen developed a parallel version of the Gray Level Texture Parameter computation algorithm, which allowed image analysis to be distributed to a number of Origin processors. According to Diouf, each ultrasound image is now subdivided into two, four, eight, or 16 segments. Each segment is then analyzed by an individual Origin processor, and the segments are recombined into one color-coded image. So far, analysis of a single image has been done on up to 16 Origin processors, which cuts the compute time to about five minutes. In the months to come, the team plans to port its code to NCSA’s NT supercluster. According to Diouf, computing on the NT supercluster is a logical move for the research team since the lab’s computers run Windows NT.
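The divide-analyze-recombine pattern Diouf describes is straightforward to illustrate. The sketch below uses Python's multiprocessing as a stand-in for the Origin2000's parallelism (the actual NCSA port would have used the machine's own parallel programming model), with a placeholder per-kernel workload standing in for the texture analysis; it assumes image dimensions that divide evenly into 8 x 8 kernels across all segments.

```python
import numpy as np
from multiprocessing import Pool

def analyze_segment(segment):
    """Placeholder per-segment analysis: one value per 8 x 8 kernel.
    In the real code this would be the texture computation."""
    rows, cols = segment.shape
    out = np.zeros((rows // 8, cols // 8))
    for r in range(0, rows - 7, 8):
        for c in range(0, cols - 7, 8):
            out[r // 8, c // 8] = segment[r:r + 8, c:c + 8].mean()
    return out

def analyze_image(image, n_segments=4):
    """Split the image into horizontal strips, analyze each strip in
    its own worker process, then recombine the results in order.
    Assumes strip heights remain multiples of 8."""
    strips = np.array_split(image, n_segments, axis=0)
    with Pool(n_segments) as pool:
        results = pool.map(analyze_segment, strips)
    return np.vstack(results)
```

Because each kernel is classified independently, the segments share no state and the recombined result matches a serial pass over the whole image, which is what makes the problem scale so cleanly from two processors up to 16.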
Texture analysis algorithms and faster ways of analyzing medical images benefit not only researchers. In the long run, the combination of new analysis methods and supercomputing power could assist radiologists and pathologists, who are often the first medical professionals to identify cancerous tissue. These days, radiologists and pathologists deal with hundreds of diagnostic images a day taken from a wide range of imaging devices, including computerized tomography, magnetic resonance imaging, positron emission tomography, ultrasound, light and electron microscopy, and 3D imaging. Examining and sorting through all these images to identify specific types of cells, including cancer cells, is labor intensive and fatiguing, says Watkin. He hopes for a future in which high-speed networks or satellites transmit medical imaging data to a supercomputing system for processing. Processing would take only minutes, and results could be displayed remotely for the technicians at hospitals and clinics.
“This kind of image analysis methodology on a large scale would mean that professionals in the hospital setting would be able to quickly identify differences in tissue and know what areas of an image need to be looked at closely,” says Watkin.
This research is supported by the National Institutes of Health’s National Cancer Institute.