Nine UW Projects Awarded Summer Use of Supercomputer in Cheyenne

June 29, 2016

June 29 — Nine projects, a number of which address atmospheric science problems, were recently chosen to receive computational time and storage space on the supercomputer in Cheyenne.

University of Wyoming faculty members and, in one case, a graduate student will head projects that will use the NCAR-Wyoming Supercomputing Center (NWSC). Each project was critically reviewed by an external panel of experts and evaluated on experimental design, computational effectiveness, efficiency of resource use, and broader impacts, such as whether the project involves both UW and NCAR researchers, strengthens UW’s research capacity, enhances UW’s computational programs, or involves research in a new or emerging field.

“The Wyoming-NCAR Allocations Panel evaluated a record-high nine requests,” says Bryan Shader, UW’s special assistant to the vice president for research and economic development, and professor of mathematics. “The projects were granted allocations totaling 42.6 million core hours of computing time on Yellowstone and will enable some incredible science on issues of importance to Wyoming, the U.S. and the world. Given that Wyoming’s share of the NWSC is 75 million core hours, these allocations and the more than 40 million (core hours) allocated in February show more than full utilization of the resource.”

Twenty-five UW-led projects used Yellowstone (as the supercomputer is named) in 2015, making Wyoming the top university in total allocations, users and usage among the more than 150 universities that use the NWSC.

Since the supercomputer came online in October 2012, allocations have been made to 65 UW research projects, including these latest nine, which commence July 1.

The newest projects, with a brief description and principal investigators, are as follows:

Maohong Fan, a UW professor of petroleum engineering, heads a project titled “Application of Density Functional Theory in CO2 Capture and Conversion Research.” The project, partially funded by the Department of Energy, seeks to design promising catalysts for capturing and converting carbon dioxide. Collaborators include Wenyong Wang, a UW professor of physics and astronomy; Ted Russell, the Howard T. Tellepsen Chair and Regents’ Professor in the School of Civil and Environmental Engineering at Georgia Tech; and Hongtao Yu, professor and chair in the Department of Chemistry and Biochemistry at Jackson State University.

Bart Geerts, a UW professor of atmospheric science, heads the project titled “Regional Climate Change Assessment in the Interior Western USA Using a Dynamical Downscaling Method with CCSM Bias Corrections: Focus on Precipitation and Snowpack.” The project focuses on better understanding how the distribution of precipitation, snowpack and stream-flow in the headwaters region of Wyoming is expected to change over the next 30-40 years. A better understanding of long-term changes in Wyoming watersheds bears directly on the state’s water obligations and water development opportunities, and is of great interest to agricultural and forestry interests in the state and to downstream stakeholders.

Collaborators include UW postdoctoral researcher Yonggang Wang, UW doctoral student Xiaoqing Jing and Changhai Liu, a scientist at NCAR’s Research Applications Laboratory. The project is partially supported by the Wyoming Water Development Commission.

Zachary Lebo, a UW assistant professor of atmospheric science, leads a project titled “Investigating Forecast Performance in Wyoming Using a High-Resolution Numerical Weather Prediction Model.” Lebo is interested in better understanding the factors that cause weather forecast errors across Wyoming and in using that understanding to create better prediction tools for ground blizzards. His project will lay the groundwork for a real-time Wyoming forecasting operation, and aspects of the project and modeling will be incorporated into UW’s “Introduction to Atmospheric Science” undergraduate course.

Xiaohong Liu, a UW professor of atmospheric science and the Wyoming Excellence Chair in Climate, will lead two projects. The first, titled “Quantifying the Impacts of Absorbing Aerosols on Rocky Mountain Regional Climate,” seeks to better understand the regional climate impacts of light-absorbing aerosols, such as dust and particles from fires or pollution, deposited on top of snow.

Collaborators include Louisa Emmons, Simone Tilmes, Andrew Gettelman and Mary Barth from the National Center for Atmospheric Research (NCAR); and Chun Zhao, Yun Qian and Ruby Leung from the Pacific Northwest National Laboratory. The project is partially funded by the College of Engineering and Applied Science’s Tier-1 Engineering Initiative.

The second project, titled “Modeling the Impacts of Biomass Burning Aerosols on Marine Stratocumulus Clouds Using a Hierarchical Modeling System,” will study the effects of wildfire particulates on cloud formation.

Collaborators include NCAR’s Emmons, Tilmes, Barth and Gettelman; Yuhang Wang, professor in the Department of Earth and Atmospheric Sciences at the Georgia Institute of Technology; and Yun Qian, of Pacific Northwest National Laboratory. The project is partially supported by a National Science Foundation (NSF)/Department of Energy grant.

Subhashis Mallick, a UW geology and geophysics professor, will lead the project titled “Anisotropic Reverse-Time Migration and Full-Waveform Inversion of Single- and Multicomponent Seismic Data and Joint Inversion of Single-Component Seismic and Electromagnetic Data.” The project will develop the key analytic tools needed to use seismic studies to determine the storage capacity, optimum resource recovery and other qualities of subsurface reservoirs as carbon dioxide storage and sequestration sites.

Scott Miller, a UW professor in the Department of Ecosystem Science and Management, heads a project titled “Integrating Dynamically Downscaled Climate Data with Hydrologic Models.” The project will couple atmospheric and hydrologic models to study the impacts of different climate scenarios on water resources and flow regimes in the Crow Creek watershed in southeast Wyoming, one of the main watersheds supplying water to Cheyenne. The project is supported by an NSF grant called Water in a Changing West.

Fred Ogden, a UW professor in the Department of Civil and Architectural Engineering, leads a project titled “ADHydro Model Development,” which will further develop and test a large-scale hydrological model that incorporates the groundwater-surface water interactions important in managing reservoirs, diversions and similar infrastructure. Both the National Oceanic and Atmospheric Administration and the U.S. National Water Center are considering incorporating Ogden’s ADHydro model into their models.

Wei Wang, a UW graduate student majoring in geology and geophysics, will undertake a project titled “Near-Surface Adjoint Tomography Based on the Discontinuous Galerkin Method.” The goal of his study is to image a portion of the Earth’s critical zone, the region between bedrock and treetops. In particular, Wang will use near-surface seismic data to understand how rocks and soil weather. Research will focus on a site near Blair-Wallis in southeastern Wyoming. The project is partially supported by the NSF grant Water in a Changing West.

By the numbers

The most recent recommended allocations total 42.6 million core hours, 270 terabytes of archival storage, and 47,000 hours on data analysis and visualization systems, Shader says. To provide some perspective on what these numbers mean, here are some useful comparisons. In simplest terms, Yellowstone can be thought of as 72,576 personal computers that are cleverly interconnected to perform as one computer. The computational time allocated is equivalent to the use of the entire supercomputer for 24.5 days, 24 hours a day. The 270 terabytes of storage would be enough to store the entire printed collection of the U.S. Library of Congress more than 20 times.

Yellowstone consists of about 70,000 processors, also known as cores. An allocation of one core hour allows a project to run one of these processors for one hour, or 1,000 of these for 1/1,000th of an hour.
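The 24.5-day equivalence follows directly from those two figures. Here is a minimal back-of-the-envelope sketch in Python, using only the numbers quoted above:

```python
# Sanity check: does a 42.6-million-core-hour allocation equal roughly
# 24.5 days of the whole machine? Both inputs are quoted in the article.

total_core_hours = 42.6e6   # total recommended allocation (core hours)
cores = 72_576              # Yellowstone's core count

# If every core ran simultaneously, the allocation would last:
hours = total_core_hours / cores   # ~587 wall-clock hours
days = hours / 24                  # ~24.5 days

print(f"{hours:.0f} hours, or about {days:.1f} days of the full system")
```

Running this prints about 587 hours, or 24.5 days, matching the comparison above.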

The successor to the Yellowstone cluster, to be called Cheyenne, is scheduled to come online in early 2017, and Yellowstone is expected to be retired in late 2017. In fall 2016, Wyoming researchers will have an opportunity to apply for early use of Cheyenne for ambitious projects that utilize its increased capabilities.

In late 2016, Wyoming researchers will be able to apply for regular allocations on Cheyenne. Wyoming’s share of Cheyenne will be around 160 million core hours per year. The new high-performance computer will be a 5.34-petaflop system, meaning it can carry out 5.34 quadrillion calculations per second. It will be capable of more than 2.5 times the amount of scientific computing performed by Yellowstone.
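To put the petaflop figure in concrete terms, here is a short sketch; the 5.34-petaflop peak rate is from the article, while the world-population estimate is an outside assumption added only for scale:

```python
# Scale of a 5.34-petaflop system. The peak rate is quoted above; the
# world-population figure (~7.4 billion in 2016) is an assumption added
# here for illustration, not a number from the article.

peak_flops = 5.34e15               # 5.34 quadrillion calculations per second

seconds_per_day = 86_400
calcs_per_day = peak_flops * seconds_per_day
print(f"{calcs_per_day:.2e} calculations per day at peak")   # ~4.61e+20

world_population = 7.4e9           # assumed estimate
per_person = peak_flops / world_population
print(f"~{per_person:,.0f} calculations per second per person on Earth")
# -> roughly 722,000 calculations every second for each person
```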

The NWSC is the result of a partnership among the University Corporation for Atmospheric Research (UCAR), the operating entity for NCAR; UW; the state of Wyoming; Cheyenne LEADS; the Wyoming Business Council; and Black Hills Energy. The NWSC is operated by NCAR under sponsorship of the NSF.

The NWSC contains one of the world’s most powerful supercomputers dedicated to improving scientific understanding of climate change, severe weather, air quality and other vital atmospheric science and geoscience topics. The center also houses a premier data storage and archival facility that holds historical climate records and other information.


Source: University of Wyoming
