2013 was an incredible year for the entire ecosystem around supercomputing, from vendors pushing new technologies to boost performance, capacity, and programmability to researchers uncovering new insights with fresh techniques. While exascale has taken more of a backseat than we might have predicted at the close of 2010, there are plenty of signs that production HPC environments are blazing new trails.
As the calendar flips into 2014, we wanted to cast a backward glance at select discoveries and opportunities made possible by the fastest systems in the world and the people who run them—all pulled from our news archives of the past year along some important thematic lines.
We’ve pulled together more than 30 examples of how supercomputers are set to change the world in 2014 and beyond. While this list is anything but exhaustive, it does show how key segments in research and industry are evolving with HPC.
In a Galaxy Far, Far Away…
One of the most famous “showcase” areas where HPC finds a mainstream shine is when news breaks of startling answers to questions as big as “where do we come from?” and “what is the universe made of?” As one might expect, 2013 was a banner year for discoveries that reached well beyond Earth.
This year, Kraken at the National Institute for Computational Sciences (NICS) at the University of Tennessee, Knoxville, addressed some stubborn, large-scale classical physics problems with revolutionary protoplanetary disk research. Meanwhile, another massive system, the Opteron-powered “Hopper” Cray XE6 at the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Lab, lit up the cosmos.
Back in October, a team of scientists from ETH Zurich and the University of Leeds solved a 300-year-old riddle about the nature of the Earth’s rotation. Using the Cray XE6 supercomputer “Monte Rosa” installed at CSCS, the researchers uncovered the reason for the gradual westward movement of the Earth’s magnetic field.
This past November, astrophysics researchers at UC San Diego advanced their understanding of star formation with the help of major computational resources from the San Diego Supercomputer Center (SDSC) at UC San Diego and the National Institute for Computational Sciences (NICS) at Oak Ridge National Laboratory.
Researchers at the Universities of Leeds and Chicago harnessed supercomputing power to uncover an important mechanism behind the generation of astrophysical magnetic fields such as the sun’s. Researchers at the Institute for Computational Cosmology (ICC) are using HPC to model phenomena ranging from solar flares to the formation of galaxies. And others, including an NSF-supported team from the University of Iowa, spent 2013 putting plenty of supercomputing might to work measuring space turbulence directly in the laboratory for the first time, allowing the world to finally see the dynamics behind it.
Some spent part of their year bolstering current resources to aid in new discoveries. For instance, a new 60-teraflops supercomputer and a 1-petabyte high-speed storage system were recently installed on the campus of the University of California, Santa Cruz, giving astrophysicists at the university the computational and storage headroom they need to model the heavens like never before.
Medical Discovery and Research
Ever-growing datasets fed by information from a widening pool of sources pushed medical research even further into supercomputing territory this year, as the modeling and simulation requirements of viruses, genomic data, and organ systems continued to climb.
Supercomputing’s reach into the human brain was one of the most widely cited research topics in medical circles in 2013. This year the Human Brain Project was carried forward by a host of supporting institutions and systems, and it was the subject of several lectures and keynotes that are worth reviewing before a fresh year begins.
Cancer research is another important field that is increasingly reliant on powerful systems. For instance, this year, researchers at Emory University reported a significant improvement in their ability to analyze and understand changes of cancer tumors over time thanks to HPC work done on a Keeneland Project supercomputer. Analysis of high resolution cancer tumor images that used to take weeks can now be completed in a matter of minutes on the hybrid GPU-CPU system.
Complex viruses, including HIV, were also the subject of a great deal of supercomputer-powered research this year. Researchers at the University of Illinois at Urbana-Champaign successfully modeled the interior of the HIV-1 virus using the Blue Waters system, opening the door to new antiretroviral drugs that target HIV-1, the virus that causes AIDS.
Other diseases, including malaria, were the target of additional innovative research. The Pittsburgh Supercomputing Center (PSC) and the University of Notre Dame received up to $1.6 million in funding from the Bill & Melinda Gates Foundation to develop a system of computers and software for the Vector Ecology and Control Network (VECNet), an international consortium working to eradicate malaria. The new VECNet Cyber-Infrastructure (CI) Project will support VECNet’s effort to unite research, industrial, and public policy efforts to attack one of the worst diseases in the developing world in more effective, economical ways.
Armed with vaccines, however, viruses can be stopped in their tracks, assuming such life-saving measures are delivered effectively. A supercomputer simulation of the West African nation of Niger showed that improving transportation infrastructure could raise vaccine availability among children and mothers from roughly 50 percent to more than 90 percent.
On another front, researchers came a step closer to understanding strokes this year. A team from UC Berkeley and the University of California San Diego (UCSD) used the supercomputing resources of the National Energy Research Scientific Computing Center (NERSC) to model the efficacy of microbubbles and high intensity focused ultrasound (HIFU) for breaking up stroke-causing clots.
Other noteworthy advances powered by world-class systems emerged this year in medical areas as diverse as autism research and medical physics. As ever more computing capacity comes online in the coming year, we expect the diverse medical field to produce stunning stories and discoveries in 2014.
Climate Change and Earth Science
The volume of scientific evidence for climate change continues to grow, a process powered in part by massive simulations, including those that put current changes in the context of global shifts over vast stretches of time.
For example, HPCwire’s Tiffany Trader wrote back in September on the “Climate Time Machine,” which relies on a database of extreme global weather events from 1871 to the present day, culled from newspaper weather reports and land and sea measurements for the early decades, along with more modern data. A team of top climate scientists fed the data into powerful supercomputers, including those at NERSC and the Oak Ridge Leadership Computing Facility in Tennessee, to create a virtual climate time machine. A sizable portion (12 percent) of the supercomputing resources at NERSC is allocated to global climate change research. That’s nearly 150 million processor-hours of highly tuned computational might focused on an issue that is critical to humanity’s future.
New systems emerged to tackle climate change data. For instance, Berkeley Lab’s Green Flash is a specialized supercomputer designed to showcase a way to perform more detailed climate modeling. The system uses customized Tensilica-based processors, similar to those found in iPhones, and communication-minimizing algorithms that cut down on the movement of data, to model the movement of clouds around the earth at a higher resolution than was previously possible, without consuming huge amounts of electricity.
This year large-scale systems, like Blue Waters at NCSA, were used by a research team including Penn State engineers to enhance scientists’ understanding of global precipitation. The team used Blue Waters to tackle the problem of large gaps in precipitation data for large parts of the world. The goal is to help scientists and researchers move toward an integrated global water cycle observatory.
Other advances in aid of climate change research drew on new and enhanced data sources, including GIS, advanced satellites, and emerging sensor technologies. From an NCAR-led project to predict air pollution to many others, the global climate change picture is filling in rapidly.
Manufacturing and Heavy Industry
Manufacturing has been a notable target of investment on both the vendor and research fronts in 2013, spurred by political rhetoric, changes in traditional manufacturing jobs, the need for STEM-based education to support a new manufacturing future, and new technologies.
At the heart of industrial progress is a constant march toward more automation, efficiency and data-driven progress. As one might imagine, this offers significant opportunities for HPC modeling and simulation—not to mention for supercomputer-fed innovations in materials science, manufacturing processes and other areas.
Several facilities, including the Ohio Supercomputer Center, lent a helping hand to bring HPC to industry in 2013, and fresh efforts are springing up, including at Lawrence Livermore National Laboratory (LLNL). For instance, this year select industrial users had a crack at Vulcan, a 5-petaflops system with 390,000 cores. With this, and a new host of commercial applications to tweak, LLNL is providing a much-needed slew of software and scaling support. The lab spent 2013 lining up participants willing to step to the high-core line and see how more compute horsepower can push modeling and simulation limits while solving specific scalability issues.
In July, companies interested in testing the latest in low-cost carbon fiber had a new opportunity to partner with the Department of Energy’s Carbon Fiber Technology Facility. The CFTF, operated by Oak Ridge National Laboratory as part of the Department’s Clean Energy Manufacturing Initiative, opened earlier this year to find ways to reduce carbon fiber production costs and to work with the private sector to stimulate widespread use of the strong, lightweight material.
Research was turned into reality in a few interesting projects this year. For instance, a team of scientists and mathematicians at the DOE’s Lawrence Berkeley National Laboratory used their powerful number crunchers, together with sophisticated algorithms, to create cleaner combustion technologies that reduce the footprint of vehicles and machines. In another addition to a cleaner manufacturing future, scientists turned a lowly crustacean’s habits into a potentially beneficial process. The “gribble” landed on the biofuel industry’s radar for its unique ability to digest wood in salty conditions. Now, researchers in the US and the UK are putting the University of Tennessee’s Kraken supercomputer to work modeling an enzyme in the gribble’s gut, which could unlock the key to developing better industrial enzymes in the future.
Another notable story related to industry came from Oak Ridge National Laboratory (ORNL), where researchers noted the importance of big rig trucks, a backbone of industry supply chains and product delivery. Most trucks get only about 6 miles to the gallon, and altogether they emit about 423 million pounds of CO2 into the atmosphere each year. South Carolina-based BMI Corp. partnered with researchers at ORNL to develop the SmartTruck UnderTray System, “a set of integrated aerodynamic fairings that improve the aerodynamics of 18-wheeler (Class 8) long-haul trucks.” After installation, the typical big rig can expect fuel savings of between 7 and 12 percent, amounting to roughly $5,000 in annual fuel costs.