The Top Supercomputing-Led Discoveries of 2013

By Nicole Hemsoth

January 2, 2014

2013 was an incredible year for the entire ecosystem around supercomputing: from vendors pushing new technologies to boost performance, capacity, and programmability to researchers turning over new insights with fresh techniques. While exascale has taken more of a backseat than we might have predicted at the close of 2010, there are plenty of signs that production HPC environments are blazing plenty of new trails.

As the calendar flips into 2014, we wanted to cast a backward glance at select discoveries and opportunities made possible by the fastest systems in the world and the people who run them—all pulled from our news archives of the past year along some important thematic lines.

We’ve pulled together over 30 examples of how supercomputers are set to change the world in 2014 and beyond. While this list is anything but exhaustive, it does show how key segments in research and industry are evolving with HPC.

In a Galaxy Far, Far Away…

One of the most famous “showcase” areas where HPC finds a mainstream shine is when news breaks of startling answers to questions as big as “where do we come from” and “what is the universe made of.” As one might expect, 2013 was a banner year for discoveries that reached well beyond Earth.

This year, Kraken at the National Institute for Computational Sciences (NICS) at the University of Tennessee Knoxville addressed some large-scale, stubborn classical physics problems with revolutionary protoplanetary disk research, while another massive system, the Opteron-powered “Hopper” Cray XE6 at the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Lab, lit up the cosmos.

Back in October, a team of scientists from ETH Zurich and the University of Leeds solved a 300-year-old riddle about the nature of the Earth’s rotation. Using the Cray XE6 supercomputer “Monte Rosa” installed at CSCS, the researchers uncovered the reason for the gradual westward movement of the Earth’s magnetic field.

This past November, astrophysics researchers at UC San Diego advanced their understanding of star formation with the help of major computational resources from the San Diego Supercomputer Center (SDSC) at UC San Diego and the National Institute for Computational Sciences (NICS) at Oak Ridge National Laboratory.

Researchers at the Universities of Leeds and Chicago harnessed supercomputing power to uncover an important mechanism behind the generation of astrophysical magnetic fields such as that of the sun. Researchers at the Institute for Computational Cosmology (ICC) are using HPC to model phenomena ranging from solar flares to the formation of galaxies. And others, including an NSF-supported team from the University of Iowa, spent 2013 using plenty of supercomputing might to measure space turbulence directly in the laboratory for the first time, allowing the world to finally see the dynamics behind it.

Some researchers spent part of their year bolstering current resources to aid in new discoveries. For instance, a new 60 teraflops supercomputer and 1 petabyte high-speed storage system were recently installed on the campus of the University of California at Santa Cruz, giving astrophysicists at the college the computational and storage headroom they need to model the heavens like never before.

Medical Discovery and Research

Ever-growing datasets fed by information from a widening pool of sources pushed medical research even further into supercomputing territory this year. The modeling and simulation requirements of viruses, genomic data, organs and other biological systems continue to demand ever more computational muscle.

Supercomputing’s reach into the human brain was one of the most widely cited research items in medical circles in 2013. This year the Human Brain Project was carried forward by a host of supporting institutions and systems, and the topic was the subject of several lectures and keynotes that are worth reviewing before a fresh year begins.

Cancer research is another important field that is increasingly reliant on powerful systems. For instance, this year researchers at Emory University reported a significant improvement in their ability to analyze and understand changes in cancer tumors over time, thanks to HPC work done on a Keeneland Project supercomputer. Analysis of high-resolution cancer tumor images that used to take weeks can now be completed in a matter of minutes on the hybrid GPU-CPU system.

Complex viruses, including HIV, were also the subject of a great deal of supercomputer-powered research this year. Researchers at the University of Illinois at Urbana-Champaign successfully modeled the interior of the HIV-1 virus using the Blue Waters system, opening the door to new antiretroviral drugs that target HIV-1, the virus that causes AIDS.

Other diseases, including malaria, were the target of additional innovative research. Pittsburgh Supercomputing Center (PSC) and the University of Notre Dame received up to $1.6 million in funding from the Bill & Melinda Gates Foundation to develop a system of computers and software for the Vector Ecology and Control Network (VECNet), an international consortium working to eradicate malaria. The new VECNet Cyber-Infrastructure (CI) Project will support VECNet’s effort to unite research, industrial and public policy efforts to attack one of the worst diseases in the developing world in more effective, economical ways.

Armed with vaccines, however, health workers can stop viruses in their tracks, assuming the delivery of such life-saving measures is handled effectively. A supercomputer simulation of the West African nation of Niger showed that improving transportation could raise vaccine availability among children and mothers from roughly 50 percent to more than 90 percent.

On another front, researchers came a step closer to understanding strokes this year. A team from UC Berkeley and the University of California San Diego (UCSD) used the supercomputing resources of the National Energy Research Scientific Computing Center (NERSC) to model the efficacy of microbubbles and high intensity focused ultrasound (HIFU) for breaking up stroke-causing clots.

Other noteworthy advances powered by world-class systems emerged this year in medical areas as diverse as autism research and new frontiers in medical physics. As ever more computing capacity comes online in the coming year, we expect the diverse medical field to produce stunning stories and discoveries in 2014.

Climate Change and Earth Science

The volume of scientific evidence supporting climate change continues to grow, a process powered by massive simulations, including those that put today’s changes in the context of global shifts over vast lengths of time.

For example, HPCwire’s Tiffany Trader wrote back in September on the “Climate Time Machine,” which relies on a database of extreme global weather events from 1871 to the present day, culled from newspaper weather reports and land and sea measurements for the first decades, along with more modern data. The team of top climate scientists fed the data into powerful supercomputers, including those at NERSC and the Oak Ridge Leadership Computing Facility in Tennessee, to create a virtual climate time machine. A sizable portion (12 percent) of the supercomputing resources at NERSC is allocated to global climate change research. That’s nearly 150 million processor-hours of highly tuned computational might focused on an issue that is critical to humanity’s future.
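Those two figures also pin down the size of NERSC’s total allocation, for anyone curious. Here is a quick back-of-the-envelope sketch using only the 12 percent share and the roughly 150 million hour figure cited above:

```python
# Back out NERSC's implied total annual allocation from the two
# figures cited above: climate research receives a 12 percent share,
# which amounts to roughly 150 million processor-hours.
climate_share = 0.12
climate_hours = 150e6  # processor-hours devoted to climate research

total_hours = climate_hours / climate_share
print(f"Implied total allocation: {total_hours / 1e9:.2f} billion processor-hours")
# -> roughly 1.25 billion processor-hours per year under these figures
```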

New systems emerged to tackle climate change data. For instance, Berkeley Lab’s Green Flash is a specialized supercomputer designed to showcase a way to perform more detailed climate modeling. The system uses customized Tensilica-based processors, similar to those found in iPhones, and communication-minimizing algorithms that cut down on the movement of data, to model the movement of clouds around the earth at a higher resolution than was previously possible, without consuming huge amounts of electricity.

This year large-scale systems, like Blue Waters at NCSA, were used by a research team including Penn State engineers to enhance scientists’ understanding of global precipitation. The team used Blue Waters to tackle the problem of large gaps in precipitation data for large parts of the world. The goal is to help scientists and researchers move toward an integrated global water cycle observatory.

Other advances using new and enhanced data sources, including GIS, advanced satellite and emerging sensor technologies, were made to aid research into other aspects of climate change. From an NCAR-led project to predict air pollution to a host of similar efforts, the global climate change picture is filling in rapidly.

Manufacturing and Heavy Industry

Manufacturing has been a notable target of investments on both the vendor and research fronts in 2013, as political rhetoric, changes in traditional manufacturing jobs, the need for STEM-based education to support a new manufacturing future, and new technologies have all converged.

At the heart of industrial progress is a constant march toward more automation, efficiency and data-driven progress. As one might imagine, this offers significant opportunities for HPC modeling and simulation—not to mention for supercomputer-fed innovations in materials science, manufacturing processes and other areas.

Several facilities, including the Ohio Supercomputer Center, have lent helping hands to bring HPC to industry in 2013, and fresh efforts are springing up, including at Lawrence Livermore National Laboratory (LLNL). For instance, this year select industrial users had a crack at Vulcan, a 5 petaflops system with 390,000 cores. With this, and a new host of commercial applications to tweak, LLNL is providing a much-needed slew of software and scaling support. The lab spent 2013 lining up participants to step up to the high-core line to see how more compute horsepower can push modeling and simulation limits while solving specific scalability issues.

In July, companies interested in testing the latest in low-cost carbon fiber had a new opportunity to partner with the Department of Energy’s Carbon Fiber Technology Facility. The CFTF, operated by Oak Ridge National Laboratory as part of the Department’s Clean Energy Manufacturing Initiative, opened earlier this year to find ways to reduce carbon fiber production costs and to work with the private sector to stimulate widespread use of the strong, lightweight material.

Research was turned into reality in a few interesting projects this year. For instance, a team of scientists and mathematicians at the DOE’s Lawrence Berkeley National Laboratory used their powerful number crunchers together with sophisticated algorithms to create cleaner combustion technologies that reduce the footprint of vehicles and machines. In another addition to cleaner manufacturing futures, scientists turned a lowly crustacean’s habits into a potentially beneficial process. The “Gribble” landed on the biofuel industry’s radar for its unique ability to digest wood in salty conditions. Now, researchers in the US and the UK are putting the University of Tennessee’s Kraken supercomputer to work modeling an enzyme in the Gribble’s gut, which could hold the key to developing better industrial enzymes in the future.

Another notable story related to industry came from Oak Ridge National Lab, where researchers noted the importance of big rig trucks, a backbone of industry supply chains and product delivery. Most trucks get only about 6 miles to the gallon, and altogether they emit about 423 million pounds of CO2 into the atmosphere each year. South Carolina-based BMI Corp. partnered with researchers at Oak Ridge National Laboratory (ORNL) to develop the SmartTruck UnderTray System, “a set of integrated aerodynamic fairings that improve the aerodynamics of 18-wheeler (Class 8) long-haul trucks.” After installation, the typical big rig can expect fuel savings of between 7 and 12 percent, amounting to roughly $5,000 in annual fuel costs.
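Those dollar figures are easy to sanity-check. Below is a minimal back-of-the-envelope sketch; the annual mileage and diesel price are our assumptions (typical 2013-era values), not numbers from the article:

```python
# Rough sanity check on the SmartTruck UnderTray savings claim.
# Assumed inputs (not from the article): a Class 8 long-haul truck
# covering ~100,000 miles/year, diesel at ~$4.00/gallon (2013-era price).
ANNUAL_MILES = 100_000      # assumed annual mileage
MPG = 6.0                   # average cited in the article
DIESEL_PRICE = 4.00         # USD per gallon, assumed

baseline_gallons = ANNUAL_MILES / MPG            # ~16,667 gallons/year
baseline_cost = baseline_gallons * DIESEL_PRICE  # ~$66,700/year on fuel

for savings in (0.07, 0.12):
    print(f"{savings:.0%} savings -> ${baseline_cost * savings:,.0f}/year")
# -> 7% saves ~$4,667/year and 12% saves ~$8,000/year, which brackets
#    the ~$5,000 annual figure quoted above.
```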
