NCSA Researchers Create Reliable Tool for Long-Term Crop Prediction in the U.S. Corn Belt

February 14, 2018

Feb. 14, 2018 — With the help of the Blue Waters supercomputer at the National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign, Blue Waters Professor Kaiyu Guan and NCSA postdoctoral fellow Bin Peng implemented and evaluated a new maize growth model. The CLM-APSIM model combines superior features of both the Community Land Model (CLM) and the Agricultural Production Systems sIMulator (APSIM), creating one of the most reliable tools for long-term crop prediction in the U.S. Corn Belt. Peng and Guan recently published their paper, “Improving maize growth processes in the community land model: Implementation and evaluation,” in the journal Agricultural and Forest Meteorology. This work is an outstanding example of the convergence of simulation and data science that drives the National Strategic Computing Initiative announced by the White House in 2015.

Conceptual diagram of the phenological stages in the original CLM, APSIM, and CLM-APSIM models, with unique features of the CLM-APSIM crop model highlighted. Note that the stage durations in this diagram are not proportional to real stage lengths and are shown for illustrative purposes only. Image courtesy of NCSA.

“One class of crop models is agronomy-based, and the other is embedded in climate models or earth system models. They are developed for different purposes and applied at different scales,” says Guan. “Because each has its own strengths and weaknesses, our idea is to combine the strengths of both types of models to make a new crop model with improved prediction performance.” Beyond this combination, the new CLM-APSIM model is distinguished by more detailed phenological stages; an explicit implementation of the impacts of abiotic environmental stresses (including nitrogen, water, temperature, and heat stress) on maize phenology and carbon allocation; and an explicit simulation of grain number.
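To make the stress idea concrete, here is a minimal, hypothetical Python sketch of how abiotic stress factors might modulate daily phenological development and carbon allocation in a process-based maize model. It follows the law-of-the-minimum convention common in crop models; it is not the CLM-APSIM code, and every parameter value is invented.

```python
# Minimal, hypothetical sketch of stress-modulated phenology and carbon
# allocation in a process-based maize model. Illustrative only; this is
# NOT the CLM-APSIM implementation, and all parameter values are made up.

def thermal_time(t_mean, t_base=8.0):
    """Daily growing degree days above a base temperature (deg C)."""
    return max(t_mean - t_base, 0.0)

def daily_update(state, t_mean, water_stress, n_stress, heat_stress):
    """Advance phenology and allocate carbon for one day.

    Stress factors lie in [0, 1], where 1.0 means no stress. Following
    the law-of-the-minimum convention, the most limiting factor
    controls growth.
    """
    stress = min(water_stress, n_stress, heat_stress)

    # Phenology advances with accumulated thermal time, slowed by stress.
    state["gdd"] += thermal_time(t_mean) * stress

    # The fraction of new carbon sent to grain rises after flowering,
    # represented here by a crude thermal-time threshold.
    grain_fraction = 0.8 if state["gdd"] > 800.0 else 0.1
    new_carbon = state["daily_assimilation"] * stress
    state["grain_c"] += new_carbon * grain_fraction
    state["veg_c"] += new_carbon * (1.0 - grain_fraction)
    return state

state = {"gdd": 0.0, "grain_c": 0.0, "veg_c": 0.0, "daily_assimilation": 5.0}
state = daily_update(state, t_mean=24.0, water_stress=0.9,
                     n_stress=1.0, heat_stress=0.8)
print(state)
```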

With support from the NCSA Blue Waters project (funded by the National Science Foundation and the University of Illinois), NASA, and the USDA National Institute of Food and Agriculture (NIFA) Foundational Program, Peng and Guan created the prototype for CLM-APSIM. “We built this new tool to bridge these two types of crop models, combining their strengths and eliminating their weaknesses.”

The team is currently running a high-resolution regional simulation over the contiguous United States to simulate corn yield in every grid cell where corn is planted. “There are hundreds of thousands of grid cells, and we run this model over each grid for 30 years in the historical simulation, and even longer for the future projection simulations,” said Peng. “Currently it takes us several minutes to calculate one model-year of simulation over a single grid. The only way to do this in a timely manner is to use parallel computing with thousands of cores on Blue Waters.”
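Peng’s description implies a large but embarrassingly parallel workload, since each grid cell can be simulated independently. The back-of-the-envelope sketch below makes that arithmetic concrete; the specific numbers (three minutes per model-year, 300,000 grid cells, 10,000 cores) are assumptions chosen to match the rough figures quoted above, not values from the project.

```python
# Back-of-the-envelope estimate of the compute cost Peng describes:
# several minutes per model-year per grid cell, hundreds of thousands
# of cells, 30-year historical runs. All numbers are assumptions.

minutes_per_model_year = 3      # "several minutes" (assumed)
grid_cells = 300_000            # "hundreds of thousands" (assumed)
years = 30

total_core_hours = minutes_per_model_year * grid_cells * years / 60
print(f"Serial cost: {total_core_hours:,.0f} core-hours")   # ~450,000

# Because grid cells are independent, the run is embarrassingly parallel:
cores = 10_000                  # "thousands of cores" (assumed)
print(f"Wall time on {cores:,} cores: {total_core_hours / cores:.0f} hours")
```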

Peng and Guan examined the tool’s results at seven locations across the U.S. Corn Belt. Compared with the earlier CLM4.5 model, CLM-APSIM more accurately simulated phenology, leaf area index, canopy height, and surface fluxes (including gross primary production, net ecosystem exchange, latent heat, and sensible heat), and it was especially better at simulating biomass partitioning and maize yield. CLM-APSIM also corrected a serious deficiency of the original CLM, which underestimated aboveground biomass and overestimated the harvest index, producing reasonable yield estimates through the wrong mechanisms.

Additionally, results from a 13-year simulation (2001-2013) at three sites in Mead, Nebraska (US-Ne1, US-Ne2, and US-Ne3) show that CLM-APSIM reproduces maize yield responses to growing-season climate (temperature and precipitation) more accurately than the original CLM4.5 when benchmarked against site-based observations and USDA county-level survey statistics.
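As a hypothetical illustration of this kind of benchmarking, the sketch below scores simulated yields against observations using root-mean-square error and mean bias; the yield values are invented for the example and are not from the study.

```python
# Hypothetical sketch of benchmarking simulated yields against observations,
# as in the Mead, NE comparison. Data values are invented for illustration.
import math

observed  = [12.1, 11.4, 13.0, 12.7, 10.9]   # t/ha, e.g. site or survey data
simulated = [11.8, 11.9, 12.6, 12.9, 10.4]   # t/ha, model output

n = len(observed)
rmse = math.sqrt(sum((s - o) ** 2 for s, o in zip(simulated, observed)) / n)
bias = sum(s - o for s, o in zip(simulated, observed)) / n
print(f"RMSE = {rmse:.2f} t/ha, mean bias = {bias:+.2f} t/ha")
```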

“We can simulate the past because we already have the weather datasets. But looking ahead to the next 50 years, how can we understand the effects of climate change? Furthermore, how can we understand what farmers can do to mitigate the climate change impact and improve the yield?” Guan said.

Their hope is to integrate satellite data into the model, much as weather forecasting does. “The ultimate goal is not only to have a model, but to forecast crop yields in real time and to project crop yields decades into the future,” said Guan. “With this technology, we want to simulate not only all the corn in Champaign County, Illinois, but everywhere in the U.S. and at a global scale.”
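A common way to blend model states with satellite observations, in the spirit of the data assimilation used in weather forecasting, is a Kalman-style update. The sketch below applies one to a modeled leaf area index value; it is a conceptual illustration only, not the group’s assimilation system, and all numbers are made up.

```python
# Minimal sketch of the satellite data assimilation idea Guan describes,
# using a scalar Kalman-style update of modeled leaf area index (LAI).
# Conceptual illustration only; not the group's assimilation system.

def assimilate(model_lai, model_var, obs_lai, obs_var):
    """Blend a model estimate with a satellite observation.

    The Kalman gain weights each source by the inverse of its variance,
    so a precise observation pulls the state harder than a noisy one.
    """
    gain = model_var / (model_var + obs_var)
    updated_lai = model_lai + gain * (obs_lai - model_lai)
    updated_var = (1.0 - gain) * model_var
    return updated_lai, updated_var

# Example: model says LAI = 3.2 (uncertain); satellite says 3.8 (less noisy).
lai, var = assimilate(model_lai=3.2, model_var=0.5, obs_lai=3.8, obs_var=0.2)
print(f"Updated LAI = {lai:.2f}, variance = {var:.2f}")
```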

From here, Peng and Guan plan to expand the tool to other staple crops, such as wheat, rice, and soybeans. They expect to complete a soybean simulation model for the entire United States within the next year.

About NCSA

The National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign provides supercomputing and advanced digital resources for the nation’s science enterprise. At NCSA, University of Illinois faculty, staff, students, and collaborators from around the globe use advanced digital resources to address research grand challenges for the benefit of science and society. NCSA has been advancing one third of the Fortune 50® for more than 30 years by bringing industry, researchers, and students together to solve grand challenges at rapid speed and scale.

About the Blue Waters Project

The Blue Waters petascale supercomputer is one of the most powerful supercomputers in the world, and is the fastest sustained supercomputer on a university campus. Blue Waters uses hundreds of thousands of computational cores to achieve peak performance of more than 13 quadrillion calculations per second. Blue Waters has more memory and faster data storage than any other open system in the world. Scientists and engineers across the country use the computing and data power of Blue Waters to tackle a wide range of challenges. Recent advances that were not possible without these resources include computationally designing the first set of antibody prototypes to detect the Ebola virus, simulating the HIV capsid, visualizing the formation of the first galaxies and exploding stars, and understanding how the layout of a city can impact supercell thunderstorms.


Source: NCSA
