The Food Industry’s Next Journey — from Mars to Exascale

By Scott Gibson, Oak Ridge National Laboratory

February 12, 2018

Editor’s note: Global food producer Mars, Incorporated participates in the US Exascale Computing Project’s Industry Council, which formed in February 2017 to facilitate information exchange between the ECP and the industrial user community. In this contributed article, ORNL’s Scott Gibson describes Mars’ efforts to leverage next-generation computing technologies to improve food safety and sustainability and create manufacturing efficiencies.

Mars, the world’s leading chocolate company and one of the largest food manufacturers, has a unique perspective on the impact that exascale computing will have on the food industry.

Creating a Safer and More Sustainable Food Supply Chain

“The food industry needs to address several grand challenges by developing innovative and sustainable solutions at the intersection of food, agriculture and health. Leveraging the power of technology will be critical on this journey. Exascale, for example, is going to be a radical enabler for helping the food, nutrition and agriculture sectors to evolve and possibly even revolutionize themselves to address these grand challenges,” said Harold Schmitz, chief science officer for Mars and director of the Mars Advanced Research Institute. Schmitz is a member of the US Department of Energy’s Exascale Computing Project Industry Council, a group of external advisors from some of the most prominent companies in the United States.

The Exascale Computing Project represents the next frontier in computing. An exascale ecosystem, expected in the 2021 time frame, will provide computational and data analysis capabilities at least 50 times more powerful than those of the fastest supercomputers in use today, and will maximize the benefits of high-performance computing (HPC) for many industries. In the case of the food industry, exascale will offer new solutions that can improve food manufacturing practices, yielding safer and more healthful products, more efficient industrial processes and a reduced carbon footprint.

“The power of exascale has the potential to advance the work of a first-of-its-kind effort led by Mars and the IBM Research – Almaden Lab, called the Consortium for Sequencing the Food Supply Chain,” Schmitz said. The consortium is centered on surveillance, risk assessment, and diagnoses of food-borne pathogens, and it is one of the few efforts in the world using the best tools of genomics, biology, and chemistry to understand nutrition, public health, and food safety.

“Although food safety has progressed immensely over the last hundred years—most notably through improvements in shelf life and the addition of macronutrients for preventive health—it remains a major challenge for food manufacturers,” Schmitz said. One in six Americans suffers a food-borne illness each year, and 3,000 of those affected die, according to the US Centers for Disease Control and Prevention. Across the globe, almost 1 in 10 people fall ill every year from eating contaminated food and 420,000 perish as a result, reports the World Health Organization.

Increased industry and regulatory attention on pathogens such as Salmonella, Campylobacter, Listeria and aflatoxin has led to breakthroughs that make our food safer, but more must be done. Scientists need a method by which they can understand the pathogens in various contexts, including the microbial community, the microbiome and the broader food chain. Going one step further, they need a method that enables them to anticipate how the pathogen would behave in real scenarios, such as: a field where crops are grown and harvested; during travel on various transportation channels; or in factory environments where ingredients are processed.

“The consortium aims to revolutionize our understanding of how to predict pathogen outbreaks and discover what environments stimulate pathogens to behave badly, or what microbial environments are able to keep pathogen outbreaks under control,” Schmitz said. “In essence, we want to sequence the genome of the food supply chain and then use data analytics to understand its microbial community. We’re working at the intersection of HPC and the field of systems biology. In this case, the system is the food supply chain, from farm to fork.”
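To give a concrete, if greatly simplified, flavor of the kind of analysis the consortium describes, the sketch below profiles a microbial community from classified sequencing-read counts taken at hypothetical supply-chain sampling points and flags any point where a pathogen’s share of the community crosses a threshold. The sample names, taxa, counts, and threshold are illustrative assumptions, not consortium data or methods.

```python
# Minimal sketch: profiling a microbial community from taxon read counts
# at several (hypothetical) supply-chain sampling points. Sample names,
# taxa, counts, and the alert threshold are illustrative only.

from collections import Counter

# Hypothetical classified-read counts per sampling point (e.g., output of a
# metagenomic classifier run upstream of this step).
samples = {
    "farm_soil":      Counter({"Lactobacillus": 9200, "Salmonella": 15,  "Bacillus": 4300}),
    "transport_bin":  Counter({"Lactobacillus": 3100, "Salmonella": 480, "Bacillus": 2200}),
    "factory_intake": Counter({"Lactobacillus": 7800, "Salmonella": 60,  "Bacillus": 5100}),
}

PATHOGEN = "Salmonella"
ALERT_FRACTION = 0.01  # flag samples where the pathogen exceeds 1% of reads

for site, counts in samples.items():
    total = sum(counts.values())
    abundance = {taxon: n / total for taxon, n in counts.items()}
    fraction = abundance.get(PATHOGEN, 0.0)
    flag = "ALERT" if fraction > ALERT_FRACTION else "ok"
    print(f"{site:15s} {PATHOGEN} fraction = {fraction:.4f}  [{flag}]")
```

At exascale, the interesting step is not this bookkeeping but the upstream classification and the downstream modeling of how whole microbial communities shift across the supply chain.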

Mars has used genome sequencing to progress its efforts to improve the supply-chain sustainability of one of its key ingredients: cocoa. It is a low-yield crop grown primarily in countries that lack the scientific and technological resources to modernize it.

“We realized we needed to give the most talented agricultural scientists a tool box to make the cocoa crop sustainable,” Schmitz said. That tool box is the genome. So, from 2008 to 2010, Mars, IBM, the US Department of Agriculture’s Agricultural Research Service, and several other collaborators sequenced the genome of Theobroma cacao, an economically important tropical fruiting tree that is the source of chocolate.

“Analyzing genomic data allowed us to understand how diverse genotypes of cacao perform in different environments. This information is then used to breed superior varieties, with increased yields, quality and stress tolerance,” said Jim Kennedy, computational science leader at the Mars Advanced Research Institute. “We also use data analytics to understand how genetic and environmental factors contribute to pest and disease losses. This information is used to develop environmentally friendly strategies to improve crop health.”
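A minimal illustration of this kind of genotype-by-environment analysis is sketched below: it ranks hypothetical cacao genotypes by mean yield across trial sites and reports their across-site spread as a rough stability measure. The genotype names, sites, and yields are made up for illustration and do not reflect Mars’ breeding data or methods.

```python
# Minimal sketch of a genotype-by-environment yield comparison, using
# made-up trial data (genotype names, sites, and yields are illustrative).
from statistics import mean, pstdev

# (genotype, environment, yield in kg/tree) -- hypothetical field-trial records
trials = [
    ("CCN-51", "site_A", 2.1), ("CCN-51", "site_B", 1.8), ("CCN-51", "site_C", 2.4),
    ("ICS-95", "site_A", 1.6), ("ICS-95", "site_B", 1.9), ("ICS-95", "site_C", 1.5),
    ("SCA-6",  "site_A", 1.2), ("SCA-6",  "site_B", 1.4), ("SCA-6",  "site_C", 1.3),
]

by_genotype = {}
for genotype, env, yld in trials:
    by_genotype.setdefault(genotype, []).append(yld)

# Rank genotypes by mean yield; report variability across environments
# as a crude proxy for stability.
for genotype, yields in sorted(by_genotype.items(), key=lambda kv: -mean(kv[1])):
    print(f"{genotype:8s} mean yield = {mean(yields):.2f} kg/tree, "
          f"across-site spread = {pstdev(yields):.2f}")
```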

“Since our breakthrough on Theobroma cacao, we’ve already seen great improvements in cocoa,” Schmitz said. “When exascale comes online it will introduce food and agriculture data scientists to an exciting new world of opportunity.”

He explained that exascale will provide food data scientists with an unprecedented level of computing power to probe molecular food chemistry in a manner akin to how the pharmaceutical industry uses technology to study protein molecular dynamics.

“Modeling, simulation and data analytics with exascale will inform food design in a way that the empirical method, or trial and error, never could,” Schmitz said. “There is possibility for this to help unlock some of the biggest food and nutritional challenges that we face today.”

Designing More Efficient Manufacturing Processes

The HPC teams at Mars, which partner with DOE National Laboratories to bolster their computational science efforts, use modeling, simulation, and data analytics to optimize not only the company’s supply-chain processes but also the design of its manufacturing processes. The teams employ HPC tools such as computational fluid dynamics, the discrete element method, and deterministic multiphysics models.

“We’re applying multiphysics models to better understand some of our essential processes such as extrusion,” Kennedy said. Extrusion is a fundamental process in which product ingredients are fed into a barrel and forced through it by a rotating screw. Mixing, sterilization, or cooking may take place in the barrel. Mars products such as confection candy, toffee candy, and pet food undergo extrusion.

“If we’re designing a new extrusion process, we’ll use modeling to optimize the design,” Kennedy said. “In the past, we would over-engineer and end up with an extruder that was one-and-a-half times bigger than what we needed. Modeling enables us to understand what the design parameters should be before we cut steel and build anything. But we’ve learned we need more computing power and speed, like what exascale will provide, to handle the complexity of our processes.”
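As a rough illustration of why modeling beats over-engineering, the sketch below applies the classical isothermal, Newtonian drag-flow/pressure-flow approximation for a single-screw extruder to estimate throughput from screw geometry and melt properties. It is a textbook toy, not the multiphysics models Kennedy describes, and every dimension and material property in it is assumed for illustration.

```python
import math

# Toy sizing check for a single-screw extruder using the classical
# isothermal, Newtonian drag-flow / pressure-flow approximation.
# All dimensions and material properties below are assumed for illustration;
# real design work relies on the multiphysics models described in the article.

D   = 0.15                 # barrel diameter, m
h   = 0.01                 # channel depth, m
L   = 1.5                  # metering-section length, m
phi = math.radians(17.7)   # helix angle of a "square-pitched" screw
N   = 1.0                  # screw speed, rev/s
mu  = 100.0                # melt viscosity, Pa*s
dP  = 5.0e6                # pressure rise over the screw, Pa

# Volumetric throughput: drag flow minus pressure (back) flow.
Q_drag     = 0.5 * math.pi**2 * D**2 * N * h * math.sin(phi) * math.cos(phi)
Q_pressure = math.pi * D * h**3 * math.sin(phi)**2 * dP / (12.0 * mu * L)
Q = Q_drag - Q_pressure

print(f"Estimated throughput: {Q * 3600 * 1000:.1f} L/h")
```

Even this crude estimate shows how geometry, speed, and melt viscosity trade off against one another; the full models add the coupled thermal, rheological, and mixing physics that demand HPC-scale computing.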

Reducing the Greenhouse Gas Footprint

Exascale will enable the food industry to pioneer more efficient manufacturing processes that use less energy, in turn lessening its environmental impact.

“The food and agriculture sectors are among the largest contributors to climate change and the loss of biodiversity,” Schmitz said. “The energy required in global agriculture, the greenhouse gases emitted, and the vast amount of land used are all contributors. The good news is that the advancements in HPC and the eventual arrival of exascale computing will enable the industry to better use data science advances to improve its environmental and ecological footprint.”

Spreading the Use of Data Science

“The advent of exascale will help spread the use of data science more widely,” Kennedy said. At present, most companies are facing a shortage of data scientists while the need for digitization is expanding. At the same time, companies are trying to automate some of the tasks that would normally require a data scientist, such as cleaning, normalizing, or preprocessing data for analysis, simulation, or modeling.
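A minimal sketch of one such automatable step appears below: dropping malformed records and min-max normalizing a field before it feeds a simulation or model. The field names and values are hypothetical.

```python
# Minimal sketch of the kind of data-preparation step described above:
# cleaning and normalizing raw plant records before analysis or modeling.
# Field names and values are hypothetical.

def clean_and_normalize(records, field):
    """Drop records with missing/invalid values, then min-max scale to [0, 1]."""
    valid = [r for r in records if isinstance(r.get(field), (int, float))]
    if not valid:
        return []
    values = [r[field] for r in valid]
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero when all values are equal
    return [{**r, field: (r[field] - lo) / span} for r in valid]

raw = [
    {"batch": 1, "temp_C": 72.5},
    {"batch": 2, "temp_C": None},   # missing reading -> dropped
    {"batch": 3, "temp_C": 68.0},
    {"batch": 4, "temp_C": "n/a"},  # malformed entry -> dropped
    {"batch": 5, "temp_C": 75.0},
]

print(clean_and_normalize(raw, "temp_C"))
```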

“Exascale will make it possible for computers to run through scenarios faster and provide the end-user with data output in language that non-experts can understand,” Kennedy said. “Then they can go about slicing and dicing the data to prepare it for simulation. I think exascale will bring that capability to the masses so that they can directly work with their data and gain the insights and ask the questions they need for their research.”

Mars recently confirmed a collaboration agreement with the Joint Institute for Computational Sciences, a joint institute of the University of Tennessee and Oak Ridge National Laboratory. The company plans to leverage DOE computational infrastructure to find solutions for some of its most complex challenges and opportunities.
