December 03, 2008
Interactive program that can search through DNA wins contest
RICHLAND, Wash., Dec. 3 -- DNA sequencing is easier than ever, but the amount of data to be analyzed is piling up. An award-winning computer program now shows that genome sequence analysis can be made interactive and intuitive, helping researchers find hidden relationships in massive amounts of data.
Researchers from the Department of Energy's Pacific Northwest National Laboratory captured "Best Overall" for their entry at the Supercomputing '08 High Performance Computing Analytics Challenge in Austin, Texas, on Nov. 20.
In the competition, scientists were judged on solving real world problems using comprehensive computational approaches, large data sets, and high-end visualization technology to display results -- which means it had to look good and be easy to use.
PNNL's Chris Oehmen led a multidisciplinary team composed of Scott Dowson, Chandrika Sivaramakrishnan, Justin Almquist, Lee Ann McCue, Bobbie-Jo Webb-Robertson, and Jason McDermott to the win. The team used resources at PNNL and at EMSL, DOE's Environmental Molecular Sciences Laboratory on the PNNL campus, to develop the interactive program.
Past finalists have been in areas as varied as orthodontics, atomic energy, and music classification. PNNL's winning entry in genomics combined multiple databases, analysis software, and a home-grown "visualization technology" called Starlight that presents data in unique visual patterns and allows users to interactively explore them.
"Our entire team is thrilled that we won," Oehmen said. "It's an honor to be a part of this international competition. We could not have completed the challenge without the support of our sponsors at the Department of Energy, the National Science Foundation, and internal investments from the Pacific Northwest National Laboratory."
A common problem for genomics researchers, said Oehmen, is that desktop computers often can't handle the volume of calculations needed to analyze many genomes at once. At the other end of the spectrum, high performance computers often limit researchers' ability to guide the analysis along the way.
"We wanted to demonstrate that high-performance computing can be integrated into an iterative workflow because this is the way biologists really work," Oehmen said. "It was the MeDICi middleware that really helped us pull the various data, analysis, and visualization together."
In genomic studies, computer programs compare DNA sequences of different living things to find shared proteins or uncover the function of a mystery protein, generating ideas that can then be tested in laboratory experiments. This interactive program gives laboratory researchers a place to start in looking for proteins and genes with interesting functions.
Oehmen's team demonstrated that the interactive program, backed by high-performance computing, could survey the full complement of proteins found in an organism, let researchers focus on a protein that intrigued them, and investigate its possible functions.
Browsing through all the proteins in various Shewanella bacterial species, the team noticed more proteins than expected with tell-tale iron-detecting components.
"We thought, there are a lot of iron-sensing proteins here. What are they doing?" said Oehmen.
As it happens, many species of Shewanella have the ability to transfer electrons to an electrode, thus forming a simple biological fuel cell, an alternate means of generating energy. Iron is involved in this activity, so the team decided to identify proteins that may help the bacteria sense the iron and form a biofilm on an electrode.
Starting with a known, non-Shewanella protein that senses iron, the program allowed researchers to guide the search for similar proteins among 42,000 proteins from 10 Shewanella species. After rounding up about 550 possible iron-sensing proteins, the researchers switched gears and determined which of these might also be involved in biofilm formation, based on other criteria. Ultimately, the team zoomed in on one protein that had potential roles in both activities. In addition, some of the species had two copies, suggesting those species might have some sort of biofilm advantage.
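The iterative narrowing described above can be sketched in a few lines. This is a hypothetical illustration, not the team's actual code: the protein records and similarity scores are invented stand-ins for real sequence-analysis results, and the cutoffs are arbitrary.

```python
# Hypothetical sketch of the two-round narrowing workflow described above.
# Each record is (species, protein_id, iron_score, biofilm_score); the
# scores stand in for real sequence-similarity results.
from collections import Counter

def narrow(proteins, iron_cutoff, biofilm_cutoff):
    # Round 1: keep proteins similar to the known iron-sensing query.
    iron_hits = [p for p in proteins if p[2] >= iron_cutoff]
    # Round 2: of those, keep ones that also meet the biofilm criterion.
    dual_hits = [p for p in iron_hits if p[3] >= biofilm_cutoff]
    # Count copies per species to spot species carrying duplicates.
    copies = Counter(p[0] for p in dual_hits)
    return iron_hits, dual_hits, copies

proteins = [
    ("S. oneidensis", "SO_0001", 0.92, 0.88),
    ("S. oneidensis", "SO_0002", 0.90, 0.91),
    ("S. baltica",    "SB_0001", 0.85, 0.20),
    ("S. baltica",    "SB_0002", 0.10, 0.95),
]
iron_hits, dual_hits, copies = narrow(proteins, 0.8, 0.5)
print(len(iron_hits), len(dual_hits), copies["S. oneidensis"])
# prints: 3 2 2
```

In the real analysis each round is a compute-heavy similarity search on a high-performance system; the interactive part is the researcher choosing the next criterion between rounds.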
"Letting users find this sort of information interactively is the main motivation for this work and for the visual representations we have chosen," said Oehmen.
The visual representations of the data included colorful "graphical clusters" that looked like pie charts to the untrained eye, and other images that looked like stars connected through space. In the Shewanella example, the star-graph clued the researchers in to the presence of the extra protein copies.
"Presenting data visually can let important information rise to the top," said Oehmen.
And there at the top, the secrets buried in DNA data just can't stay hidden.
A video demonstration of the program can be found here: http://www.pnl.gov/science/highlights/highlight.asp?id=516.
PNNL-developed software used in the demonstration included SHOT, a sequence analysis algorithm that transforms protein sequences into sets of features and uses a support vector machine to identify homologous pairs; ScalaBLAST, an algorithm for performing gene and protein sequence analysis; and Starlight, an information visualization application for advanced interactive visual analysis, now commercially available from Menlo Park-based Future Point Systems, Inc. Bringing them together was MeDICi, a "plumbing" software technology developed as part of PNNL's Data Intensive Computing Research Initiative that supports the integration of applications, data, and computing resources.
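The feature-vector idea behind tools like SHOT can be illustrated with a toy example. This sketch is an assumption about the general approach, not SHOT's actual implementation: it turns each sequence into k-mer counts and compares them with cosine similarity, where a real system would feed such features to a trained support vector machine.

```python
# Toy illustration of sequence-to-feature-vector comparison (not SHOT's
# actual method): count overlapping k-mers, then compare count vectors.
from collections import Counter
from math import sqrt

def kmer_features(seq, k=3):
    # Count overlapping length-k substrings of the sequence.
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def cosine(a, b):
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[key] * b[key] for key in a.keys() & b.keys())
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

p1 = kmer_features("MKTAYIAKQR")
p2 = kmer_features("MKTAYIAKQG")  # near-identical sequence
p3 = kmer_features("GGGGGGGGGG")  # unrelated sequence
print(cosine(p1, p2) > cosine(p1, p3))  # similar pair scores higher
# prints: True
```

The payoff of the feature-vector representation is that candidate homologs can be screened with cheap vector math before any expensive alignment or classifier is run.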
EMSL, the Environmental Molecular Sciences Laboratory, is a national scientific user facility sponsored by the Department of Energy's Office of Science, Biological and Environmental Research program that is located at Pacific Northwest National Laboratory. EMSL offers an open, collaborative environment for scientific discovery to researchers around the world. EMSL's technical experts and suite of custom and advanced instruments are unmatched. Its integrated computational and experimental capabilities enable researchers to realize fundamental scientific insights and create new technologies.
Pacific Northwest National Laboratory is a Department of Energy Office of Science national laboratory where interdisciplinary teams advance science and technology and deliver solutions to America's most intractable problems in energy, national security and the environment. PNNL employs 4,200 staff and has an $850 million annual budget. Ohio-based Battelle has managed PNNL since the lab's inception in 1965.
Source: DOE/Pacific Northwest National Laboratory