Comet Supercomputer Assists With Genomic Research

November 3, 2016

Nov. 3 — One of the most detailed genomic studies of any ecosystem to date has revealed an underground world of stunning microbial diversity, and added dozens of new branches to the tree of life.

The bacterial bonanza comes from scientists who reconstructed the genomes of more than 2,500 microbes from sediment and groundwater samples collected at an aquifer in Colorado. The effort was led by researchers from the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) and UC Berkeley. DNA sequencing was performed at the Joint Genome Institute, a DOE Office of Science User Facility, and analyses were conducted with the aid of the CIPRES gateway and the Comet supercomputer, based at the San Diego Supercomputer Center (SDSC) at the University of California San Diego.

As reported online October 24 in the journal Nature Communications, the scientists netted genomes from 80 percent of all known bacterial phyla, a remarkable degree of biological diversity at one location. They also discovered 47 new phylum-level bacterial groups, naming many of them after influential microbiologists and other scientists. And they learned new insights about how microbial communities work together to drive processes that are critical to the planet’s climate and life everywhere, such as the carbon and nitrogen cycles.

These findings shed light on one of Earth’s most important and least understood realms of life. The subterranean world hosts up to one-fifth of all biomass, but it remains a mystery.

“We didn’t expect to find this incredible microbial diversity. But then again, we know little about the roles of subsurface microbes in biogeochemical processes, and more broadly, we don’t really know what’s down there,” says Jill Banfield, a Senior Faculty Scientist in Berkeley Lab’s Climate & Ecosystem Sciences Division and a UC Berkeley professor in the departments of Earth and Planetary Science, and Environmental Science, Policy, and Management.

Added UC Berkeley’s Karthik Anantharaman, the first author of the paper: “To better understand what subsurface microbes are up to, our approach is to access their entire genomes. This enabled us to discover a greater interdependency among microbes than we’ve seen before.”

The research is part of a Berkeley Lab-led project called Watershed Function Scientific Focus Area (formerly Sustainable Systems Scientific Focus Area 2.0). The project is developing a predictive understanding of terrestrial environments from the genome to the watershed scale. The field research takes place at a research site near the town of Rifle, Colorado, where for the past several years scientists have conducted experiments designed to stimulate populations of subterranean microbes that are naturally present in very low numbers.

The scientists sent soil and water samples from these experiments to the Joint Genome Institute for terabase-scale metagenomic sequencing. This high-throughput method isolates and purifies DNA from environmental samples, and then sequences one trillion base pairs of DNA at a time. Next, the scientists used bioinformatics tools developed in Banfield’s lab along with those from the CIPRES Science Gateway to analyze the data.

“The CIPRES Science Gateway and the Comet supercomputer were instrumental to our work,” Banfield said. “Considering the unprecedented size of our sequence datasets, we were unable to complete any runs for inferring trees on other servers.” The CIPRES Science Gateway and Comet are available through the Extreme Science and Engineering Discovery Environment (XSEDE). Supported by the National Science Foundation, XSEDE is the most advanced, powerful, and robust collection of integrated advanced digital resources and services in the world.

The scientists’ approach has redrawn the tree of life. Counting both the 47 new bacterial groups reported in this work and the 35 new groups published last year (also found at the Rifle site), Banfield’s team has doubled the number of known bacterial groups.

With discovery comes naming rights. The scientists named many of the new bacteria groups after Berkeley Lab and UC Berkeley researchers. For example, there’s Candidatus Andersenbacteria, after phylochip inventor Gary Andersen, and there’s Candidatus Doudnabacteria, after CRISPR genome-editing pioneer Jennifer Doudna.

“Berkeley now dominates the tree of life as it does the periodic table,” Banfield says, in a nod to the sixteen elements discovered at Berkeley Lab and UC Berkeley.

Another big outcome is a deeper understanding of the roles subsurface microbes play in globally important carbon, hydrogen, nitrogen, and sulfur cycles. This information will help to better represent these cycles in predictive models such as climate simulations.

The scientists conducted metabolic analyses of 36 percent of the organisms detected in the aquifer system. They focused on a phenomenon called metabolic handoff, which essentially means one microbe’s waste is another microbe’s food. It’s known from lab studies that handoffs are needed in certain reactions, but these interconnected networks are widespread and vastly more complex in the real world.

To understand why it’s important to represent metabolic handoffs as accurately as possible in models, consider nitrate, a groundwater contaminant from fertilizers. Subsurface microbes are the primary drivers of the reduction of nitrate to harmless nitrogen gas. There are four steps in this denitrification process, and the third step creates nitrous oxide—one of the most potent greenhouse gases. The process breaks down if microbes that carry out the fourth step are inactive when a pulse of nitrate enters the system.

“If microbes aren’t there to accept the nitrous oxide handoff, then the greenhouse gas escapes into the atmosphere,” says Anantharaman.

The scientists found the carbon, hydrogen, nitrogen, and sulfur cycles are all driven by metabolic handoffs that require an unexpectedly high degree of interdependence among microbes. The vast majority of microorganisms can’t fully reduce a compound on their own. It takes a team. There are also backup microbes ready to perform a handoff if first-string microbes are unavailable.
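The handoff chain described above can be sketched in a few lines of code. This is a minimal illustrative model, not anything from the paper: the microbe names and the `run_handoff_chain` function are hypothetical, and only the four-step denitrification pathway (nitrate to nitrite to nitric oxide to nitrous oxide to nitrogen gas) comes from the article.

```python
# Toy model of metabolic handoffs in denitrification. Each step must be
# "accepted" by at least one active microbe (first-string or backup);
# if no microbe accepts a handoff, the current intermediate escapes.
DENITRIFICATION_STEPS = [
    "NO3- -> NO2-",  # step 1: nitrate to nitrite
    "NO2- -> NO",    # step 2: nitrite to nitric oxide
    "NO -> N2O",     # step 3: nitric oxide to nitrous oxide (greenhouse gas)
    "N2O -> N2",     # step 4: nitrous oxide to harmless nitrogen gas
]

def run_handoff_chain(active_microbes):
    """active_microbes maps a step index (0-3) to the list of microbes
    currently able to perform that step. Returns the final product:
    'N2' if every handoff is accepted, otherwise the escaping intermediate."""
    intermediates = ["NO3-", "NO2-", "NO", "N2O", "N2"]
    for step in range(4):
        if not active_microbes.get(step):
            # No microbe (and no backup) accepts this handoff, so the
            # current intermediate escapes. At step 3 that is nitrous oxide.
            return intermediates[step]
    return "N2"

# Full community, including a backup for step 4: nitrate is fully reduced.
community = {0: ["A"], 1: ["B"], 2: ["C"], 3: ["D", "D-backup"]}
print(run_handoff_chain(community))  # N2

# Step-4 microbes inactive: the greenhouse gas N2O escapes instead.
community[3] = []
print(run_handoff_chain(community))  # N2O
```

The backup microbes in the example mirror the "first-string" language above: the chain completes as long as any member of the list for a step remains active.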

“The combination of high microbial diversity and interconnections through metabolic handoffs likely results in high ecosystem resilience,” says Banfield.

Other co-authors of the paper include Berkeley Lab’s Eoin Brodie, Susan Hubbard, Ulas Karaoz, and Kenneth Williams; and UC Berkeley’s Chris Brown, Cindy Castelle, Laura Hug, Alexander Probst, Itai Sharon, Andrea Singh, and Brian Thomas. The research is supported by the Department of Energy’s Office of Science.

About SDSC

As an Organized Research Unit of UC San Diego, SDSC is considered a leader in data-intensive computing and cyberinfrastructure, providing resources, services, and expertise to the national research community, including industry and academia. Cyberinfrastructure refers to an accessible, integrated network of computer-based resources and expertise, focused on accelerating scientific inquiry and discovery. SDSC supports hundreds of multidisciplinary programs spanning a wide variety of domains, from earth sciences and biology to astrophysics, bioinformatics, and health IT. SDSC’s Comet joins the Center’s data-intensive Gordon cluster; both are part of the National Science Foundation’s XSEDE (Extreme Science and Engineering Discovery Environment) program.

About LBNL

Lawrence Berkeley National Laboratory addresses the world’s most urgent scientific challenges by advancing sustainable energy, protecting human health, creating new materials, and revealing the origin and fate of the universe. Founded in 1931, Berkeley Lab’s scientific expertise has been recognized with 13 Nobel prizes. The University of California manages Berkeley Lab for the U.S. Department of Energy’s Office of Science. For more, visit www.lbl.gov.


Source: SDSC
