December 03, 2008
Interactive program that can search through DNA wins contest
RICHLAND, Wash., Dec. 3 -- DNA sequencing is easier than ever, but the amount of data to be analyzed is piling up. An award-winning computer program now shows that genome sequence analysis can be made interactive and intuitive, helping researchers find hidden relationships in massive amounts of data.
Researchers from the Department of Energy's Pacific Northwest National Laboratory captured "Best Overall" for their entry at the Supercomputing '08 High Performance Computing Analytics Challenge in Austin, Texas, on Nov. 20.
In the competition, scientists were judged on solving real-world problems using comprehensive computational approaches, large data sets, and high-end visualization technology to display results -- which means the entry had to look good and be easy to use.
PNNL's Chris Oehmen led a multidisciplinary team composed of Scott Dowson, Chandrika Sivaramakrishnan, Justin Almquist, Lee Ann McCue, Bobbie-Jo Webb-Robertson, and Jason McDermott to the win. The team used resources at PNNL and at EMSL, DOE's Environmental Molecular Sciences Laboratory on the PNNL campus, to develop the interactive program.
Past finalists have been in areas as varied as orthodontics, atomic energy, and music classification. PNNL's winning entry in genomics combined multiple databases, analysis software, and a home-grown "visualization technology" called Starlight that presents data in unique visual patterns and allows users to interactively explore them.
"Our entire team is thrilled that we won," Oehmen said. "It's an honor to be a part of this international competition. We could not have completed the challenge without the support of our sponsors at the Department of Energy, the National Science Foundation, and internal investments from the Pacific Northwest National Laboratory."
A common problem for genomics researchers, said Oehmen, is that desktop computers often can't handle the volume of calculations needed to analyze many genomes at once. At the other end of the spectrum, high performance computers often limit researchers' ability to guide the analysis along the way.
"We wanted to demonstrate that high-performance computing can be integrated into an iterative workflow because this is the way biologists really work," Oehmen said. "It was the MeDICi middleware that really helped us pull the various data, analysis, and visualization together."
In genomic studies, computer programs compare DNA sequences of different living things to find shared proteins or uncover the function of a mystery protein, generating ideas that can then be tested in laboratory experiments. This interactive program gives laboratory researchers a place to start in looking for proteins and genes with interesting functions.
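To illustrate the kind of comparison such programs automate, here is a minimal, hypothetical sketch -- not PNNL's actual SHOT or ScalaBLAST code -- that scores how alike two protein sequences are by comparing their k-mer (length-k substring) content. Real tools use alignment-based scoring; the sequences below are made up for illustration:

```python
# Toy sequence comparison: score similarity between two protein sequences
# by the overlap of their k-mer sets. Real tools such as BLAST use
# alignment scoring instead; this is only a conceptual simplification.

def kmers(seq, k=3):
    """Return the set of all length-k substrings of a sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def jaccard_similarity(seq_a, seq_b, k=3):
    """Jaccard similarity of the two sequences' k-mer sets (0.0 to 1.0)."""
    a, b = kmers(seq_a, k), kmers(seq_b, k)
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

# Two fragments sharing a common stretch score higher than unrelated ones.
known     = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"
candidate = "MKTAYIAKQRQISFVKSHFSRQAPNDSQWTRLI"
unrelated = "GGGPLWWCHHNNEEDDRRSSTTVVLLIIKKFFM"

print(jaccard_similarity(known, candidate))   # relatively high
print(jaccard_similarity(known, unrelated))   # near zero
```

A real pipeline scales this idea to tens of thousands of proteins, which is where high-performance computing becomes necessary.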
Oehmen's team demonstrated that the interactive program, backed by high computing power, could explore the complement of proteins found in an organism, let researchers focus on a protein that intrigued them, and investigate its possibilities.
Browsing through all the proteins in various Shewanella bacterial species, the team noticed more proteins than expected with tell-tale iron-detecting components.
"We thought, there are a lot of iron-sensing proteins here. What are they doing?" said Oehmen.
As it happens, many species of Shewanella have the ability to transfer electrons to an electrode, thus forming a simple biological fuel cell, an alternate means of generating energy. Iron is involved in this activity, so the team decided to identify proteins that may help the bacteria sense the iron and form a biofilm on an electrode.
Starting with a known, non-Shewanella protein that senses iron, the program allowed researchers to guide the search for similar proteins among 42,000 proteins from 10 Shewanella species. After rounding up about 550 possible iron-sensing proteins, the researchers switched gears and determined which of these might also be involved in biofilm formation, based on other criteria. Ultimately, the team zoomed in on one protein that had potential roles in both activities. In addition, some of the species had two copies, suggesting those species might have some sort of biofilm advantage.
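The workflow just described -- start from a known protein, round up similar candidates, then narrow by a second criterion -- can be sketched generically. Everything below (the data, the scoring functions, the threshold) is hypothetical; the team's actual analysis ran SHOT and ScalaBLAST over thousands of real Shewanella proteins:

```python
# Hypothetical sketch of an iterative filtering workflow: first select
# candidates similar to a known reference, then narrow by a second test.

def filter_pipeline(proteins, similarity, second_criterion, threshold=0.5):
    """Two-stage narrowing of a protein collection.

    proteins         -- dict mapping protein name to a record
    similarity       -- function(record) -> score in [0, 1] vs. reference
    second_criterion -- function(record) -> bool (e.g. biofilm-related)
    """
    # Stage 1: round up candidates resembling the known reference protein.
    candidates = {name: rec for name, rec in proteins.items()
                  if similarity(rec) >= threshold}
    # Stage 2: keep only candidates that also pass the second criterion.
    return {name: rec for name, rec in candidates.items()
            if second_criterion(rec)}

# Toy data: records carry a made-up similarity score and a biofilm flag.
toy_proteins = {
    "protA": {"sim": 0.9, "biofilm": True},
    "protB": {"sim": 0.8, "biofilm": False},
    "protC": {"sim": 0.2, "biofilm": True},
}
hits = filter_pipeline(toy_proteins,
                       similarity=lambda r: r["sim"],
                       second_criterion=lambda r: r["biofilm"])
print(sorted(hits))  # ['protA'] -- only protA passes both stages
```

The interactive part is what distinguishes the PNNL system: rather than fixing the threshold and criteria up front, the researcher adjusts them visually between stages.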
"Letting users find this sort of information interactively is the main motivation for this work and for the visual representations we have chosen," said Oehmen.
The visual representations of the data included colorful "graphical clusters" that looked like pie charts to the untrained eye, and other images that looked like stars connected through space. In the Shewanella example, the star-graph tipped the researchers off to the presence of the extra protein copies.
"Presenting data visually can let important information rise to the top," said Oehmen.
And there at the top, the secrets buried in DNA data just can't stay hidden.
A video demonstration of the program can be found here: http://www.pnl.gov/science/highlights/highlight.asp?id=516.
Software developed by PNNL and used for the demonstration included SHOT, a sequence analysis algorithm that transforms protein sequences into sets of features and uses a support vector machine to identify homologous pairs; ScalaBLAST, an algorithm for performing gene and protein sequence analysis; and Starlight, an information visualization application for advanced interactive visual analysis, now commercially available from Menlo Park-based Future Point Systems, Inc. Bringing them together was MeDICi, a "plumbing" software technology developed as part of PNNL's Data Intensive Computing Research Initiative that supports the integration of applications, data, and computing resources.
EMSL, the Environmental Molecular Sciences Laboratory, is a national scientific user facility sponsored by the Department of Energy's Office of Science, Biological and Environmental Research program that is located at Pacific Northwest National Laboratory. EMSL offers an open, collaborative environment for scientific discovery to researchers around the world. EMSL's technical experts and suite of custom and advanced instruments are unmatched. Its integrated computational and experimental capabilities enable researchers to realize fundamental scientific insights and create new technologies.
Pacific Northwest National Laboratory is a Department of Energy Office of Science national laboratory where interdisciplinary teams advance science and technology and deliver solutions to America's most intractable problems in energy, national security and the environment. PNNL employs 4,200 staff and has an $850 million annual budget. Ohio-based Battelle has managed PNNL since the lab's inception in 1965.
Source: DOE/Pacific Northwest National Laboratory