December 08, 2006
It turns out that sequencing the human genome -- determining the order of DNA building blocks -- has not completely cracked the code of how DNA directs various cellular processes. In addition to the sequence of the base pairs, the instructions are in the packaging -- how DNA is folded within a cell.
Virginia Tech researchers used novel methodology and the university's System X supercomputer to carry out what is probably the first simulation to explore the full range of motions of a DNA strand of 147 base pairs, the length required to form the fundamental unit of DNA packing in living cells -- the nucleosome. Contrary to the long-held belief that DNA is hard to bend, the simulation shows in crisp atomic detail that DNA is considerably more flexible than commonly thought.
The research is published in the December issue of the Biophysical Journal, in the article "A Computational Study of Nucleosomal DNA Flexibility," by Jory Zmuda Ruscio of Leesburg, Va., a Ph.D. student in the Genetics, Bioinformatics and Computational Biology Program at Virginia Tech, and Alexey Onufriev of Blacksburg, assistant professor of computer science and physics at Virginia Tech. They have been invited to give a platform presentation at the 51st Biophysical Society Annual Meeting in Baltimore in March.
There is about six feet of DNA in a human cell, but it is packaged into nucleosomes -- lengths of 147 base pairs, each wrapped around eight special proteins. A nucleosome looks something like the lumpy beginning of a rubber-band ball, or perhaps a lumpy worm coil. Uncoiled, the worm wiggles, flexes, and even kinks, according to the simulation performed on System X.
As we know from watching forensic detective shows on TV, the DNA in all of an individual's cells is identical. The DNA in fingernail cells is exactly the same as in muscle. Yet the cells are different. "This is because, roughly speaking, the DNA in different cell types is packed differently and the complexes it forms with the surrounding proteins are in different positions, so only the relevant part of the code can be read at a time," said Onufriev. "Although nobody knows exactly how it happens, you can imagine reading only what you can see on a part of a crumpled newspaper."
The traditional view is that DNA is relatively rigid and that considerable energy is required when it needs to be bent to form protein-DNA complexes. However, recent experiments (Nature, Aug. 17, 2006) have begun to challenge that view. "The famous double-helix may be much more flexible than previously thought," said Onufriev.
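To see why bending was assumed to be so costly, consider a back-of-the-envelope estimate using the standard worm-like-chain picture of DNA elasticity. The sketch below is illustrative only -- the persistence length, helical rise, and superhelix radius are textbook values, not numbers reported by the Virginia Tech study.

    # Rough estimate of the elastic energy needed to wrap nucleosomal DNA,
    # using the classic worm-like-chain model. All parameters are textbook
    # values (assumptions), not results from the Virginia Tech study.
    LP = 50.0    # DNA persistence length, nm (the classic "stiff DNA" figure)
    BP = 147     # base pairs wrapped in one nucleosome
    RISE = 0.34  # helical rise per base pair, nm
    R = 4.2      # radius of the DNA superhelix around the histone core, nm

    L = BP * RISE                    # contour length of wrapped DNA, ~50 nm
    energy_kT = 0.5 * LP * L / R**2  # bending energy in units of kB*T

    print(f"wrapped length: {L:.1f} nm")          # ~50 nm
    print(f"bending energy: {energy_kT:.0f} kT")  # ~70 kT under these assumptions

An elastic penalty on the order of 70 kT is enormous by cellular standards, which is why the sharp bending of DNA around the histone core has long seemed puzzling under the traditional, rigid-rod view.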
The Virginia Tech research responded to this debate. Using 128 of System X's 1,100 processors, the researchers produced a simulation movie that reveals DNA wiggling like a worm, showing greater flexibility than the traditional view predicts. The DNA packing in the nucleosome is also found to be surprisingly loose. "The implication is that it may not cost much energy to bend the DNA -- even to bend sharply," said Onufriev.
The methodology that made this possible is based on the so-called "implicit solvent" approach being developed by Onufriev. "Biology does not happen in a vacuum," he said. "We are 75 percent water, and the effect of the water environment must be taken into account when studying biomolecules."
Previous simulations were often slowed because they explicitly tracked the water that is present in living systems. In early studies of protein folding, for instance, only a few percent of the computing effort went to the activity of the protein itself, while the rest was consumed by the surrounding fluid. The "implicit solvent" approach accounts for the role of water on average, but the movements of individual water molecules are not computed, freeing computational capacity for the molecule being studied.
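For readers curious what an implicit-solvent simulation looks like in practice, the sketch below sets one up with the open-source OpenMM toolkit and a Generalized Born water model. This illustrates the general approach only -- it is not the software, force field, or input used in the study, and the file name dna_147bp.pdb is hypothetical.

    # Minimal implicit-solvent molecular dynamics sketch using OpenMM.
    # Illustrative only; not the methodology used in the published study.
    from openmm import app, unit, LangevinMiddleIntegrator

    pdb = app.PDBFile('dna_147bp.pdb')  # hypothetical input structure

    # A Generalized Born (GB) model folds the average effect of water into
    # the force field, so no explicit water molecules need to be simulated.
    forcefield = app.ForceField('amber14-all.xml', 'implicit/gbn2.xml')
    system = forcefield.createSystem(pdb.topology,
                                     nonbondedMethod=app.NoCutoff,
                                     constraints=app.HBonds)

    integrator = LangevinMiddleIntegrator(300*unit.kelvin, 1/unit.picosecond,
                                          0.002*unit.picoseconds)
    simulation = app.Simulation(pdb.topology, system, integrator)
    simulation.context.setPositions(pdb.positions)
    simulation.step(1000)  # short demonstration run

Because nearly every processor cycle goes to the DNA itself rather than to tens of thousands of surrounding water molecules, the same hardware can reach far longer simulated timescales.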
"Experiment cannot always probe atomic detail of living molecules because they are too small and often move too fast, said Onufriev. "But we can combine computational power with good algorithms to simulate these motions at high (atom-scale) resolution.
"It is an exciting time to do molecular modeling," he said. "The computing power and the methodology have come to the point that we can begin to fully probe biology on timescales very relevant to living things -- such as DNA packing."
Virginia Tech's System X supercomputer was critical to this research, he said. "It was the combination of its sheer compute power with the algorithmic advantages that made it possible to run molecular simulations on that scale."
So far, the Virginia Tech research team has addressed the question of how flexible DNA is, which is only a small piece of the "second part of the genetic code" puzzle, Onufriev said. "However, this small piece should pave the way to addressing bigger questions, such as 'Exactly how is the tightly packed genetic content read by cellular machines?'"
"Atomic level simulations can complement experimentation and narrow competing theories," said Onufriev. "For systems as large as the nucleosome, simulations using virtual water may be the only practical way to estimate the stability of various confirmations," he said.
How DNA bends and flexes is critical for many cellular processes including cell differentiation and DNA replication. Although also observed in recent experiments, this unusual DNA flexibility is still unexplained. "Now seeing that DNA is not as hard to bend may lead to radical changes in our perspective," said Onufriev. "We are using these detailed pictures to see exactly how DNA bends and to understand the details of the mechanism behind it, something that is very hard or impossible to do experimentally."
Onufriev and his group of biochemistry, physics, biology, and computer science researchers received a $1.1 million grant from the National Institutes of Health to develop high-performance computing methodology to create molecular models and to probe the mechanisms of biology in atomic detail.
The purpose of the NIH award is to develop the methodology for computer simulations of complex biological processes and to address the question of the atomic mechanism of DNA flexibility, Onufriev said. "This research may not only provide fundamental insights into how life works at the molecular level but also have applications in drug discovery, in particular rational drug design, which is an important consideration for the NIH."
Source: Virginia Tech