The Food Industry’s Next Journey — from Mars to Exascale

By Scott Gibson, Oak Ridge National Laboratory

February 12, 2018

Editor’s note: Global food producer Mars, Incorporated participates in the US Exascale Computing Project’s Industry Council, which formed in February 2017 to facilitate information exchange between the ECP and the industrial user community. In this contributed article, ORNL’s Scott Gibson describes Mars’ efforts to leverage next-generation computing technologies to improve food safety and sustainability and create manufacturing efficiencies.

Mars, the world’s leading chocolate company and one of the largest food manufacturers, has a unique perspective on the impact that exascale computing will have on the food industry.

Creating a Safer and More Sustainable Food Supply Chain

“The food industry needs to address several grand challenges by developing innovative and sustainable solutions at the intersection of food, agriculture and health. Leveraging the power of technology will be critical on this journey. Exascale, for example, is going to be a radical enabler for helping the food, nutrition and agriculture sectors to evolve and possibly even revolutionize themselves to address these grand challenges,” said Harold Schmitz, chief science officer for Mars and director of the Mars Advanced Research Institute. Schmitz is a member of the US Department of Energy’s Exascale Computing Project Industry Council, a group of external advisors from some of the most prominent companies in the United States.

The Exascale Computing Project represents the next frontier in computing. An exascale ecosystem, expected in the 2021 time frame, will provide computational and data analysis performance at least 50 times more powerful than the fastest supercomputers in use today, and will maximize the benefits of high-performance computing (HPC) for many industries. In the case of the food industry, exascale will offer new solutions that can improve food manufacturing practices, yielding safer and more healthful products, more efficient industrial processes and a reduced carbon footprint.

“The power of exascale has the potential to advance the work of a first-of-its-kind effort led by Mars and the IBM Research – Almaden Lab, called the Consortium for Sequencing the Food Supply Chain,” Schmitz said. The consortium is centered on surveillance, risk assessment, and diagnoses of food-borne pathogens, and it is one of the few efforts in the world using the best tools of genomics, biology, and chemistry to understand nutrition, public health, and food safety.

“Although food safety has progressed immensely over the last hundred years—most notably through improvements in shelf life and the addition of macronutrients for preventive health—it remains a major challenge for food manufacturers,” Schmitz said. One in six Americans suffers a food-borne illness each year, and 3,000 of those affected die, according to the US Centers for Disease Control and Prevention. Across the globe, almost 1 in 10 people fall ill every year from eating contaminated food and 420,000 perish as a result, reports the World Health Organization.

Increased industry and regulatory attention on pathogens such as Salmonella, Campylobacter, Listeria and aflatoxin has led to breakthroughs that make our food safer, but more must be done. Scientists need a method by which they can understand the pathogens in various contexts, including the microbial community, the microbiome and the broader food chain. Going one step further, they need a method that enables them to anticipate how the pathogen would behave in real scenarios, such as: a field where crops are grown and harvested; during travel on various transportation channels; or in factory environments where ingredients are processed.

“The consortium aims to revolutionize our understanding of how to predict pathogen outbreaks and discover what environments stimulate pathogens to behave badly, or what microbial environments are able to keep pathogen outbreaks under control,” Schmitz said. “In essence, we want to sequence the genome of the food supply chain and then use data analytics to understand its microbial community. We’re working at the intersection of HPC and the field of systems biology. In this case, the system is the food supply chain, from farm to fork.”
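The core operation behind this kind of sequence-based surveillance is matching genetic material from environmental samples against known pathogen signatures. The toy sketch below illustrates the idea with simple k-mer overlap; production pipelines use dedicated metagenomic tools and curated reference databases, and the sequences and names here are hypothetical.

```python
# Illustrative sketch only: compare a sequencing read against a known pathogen
# "signature" sequence by counting shared k-mers (length-k substrings).
# All sequences below are made up for illustration.

def kmers(seq, k=8):
    """Return the set of all length-k substrings of a DNA sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def signature_hits(read, signature, k=8):
    """Fraction of the read's k-mers that also occur in the pathogen signature."""
    read_kmers = kmers(read, k)
    if not read_kmers:
        return 0.0
    return len(read_kmers & kmers(signature, k)) / len(read_kmers)

# Hypothetical example: a sample read compared against a mock marker sequence.
marker = "ATGGCTAGCTTACGGATCCGATCGTACGATCG"
sample = "TTACGGATCCGATCGTACG"   # overlaps the marker -> high score
control = "GGGGGGGGGGGGGGGGGGG"  # unrelated sequence -> zero score

print(signature_hits(sample, marker))
print(signature_hits(control, marker))
```

At exascale, the interesting problem is not one read against one marker but billions of reads against every known pathogen genome, tracked across farms, transport, and factories over time.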

Mars has used genome sequencing to progress its efforts to improve the supply-chain sustainability of one of its key ingredients: cocoa. It is a low-yield crop grown primarily in countries that lack the scientific and technological resources to modernize it.

“We realized we needed to give the most talented agricultural scientists a tool box to make the cocoa crop sustainable,” Schmitz said. That tool box is the genome. So, from 2008 to 2010, Mars, IBM, the US Department of Agriculture’s Agricultural Research Service, and several other collaborators sequenced the genome of Theobroma cacao, an economically important tropical fruiting tree that is the source of chocolate.

“Analyzing genomic data allowed us to understand how diverse genotypes of cacao perform in different environments. This information is then used to breed superior varieties, with increased yields, quality and stress tolerance,” said Jim Kennedy, computational science leader at the Mars Advanced Research Institute. “We also use data analytics to understand how genetic and environmental factors contribute to pest and disease losses. This information is used to develop environmentally friendly strategies to improve crop health.”
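The genotype-by-environment analysis Kennedy describes can be sketched in miniature: aggregate field-trial yields per genotype across environments and rank the candidates. The data and genotype names below are hypothetical, and real breeding programs use far richer statistical models.

```python
# A toy version of genotype-by-environment analytics: rank cacao genotypes
# by mean yield across trial environments. Records are made up for illustration.
from collections import defaultdict

# (genotype, environment, yield in kg per tree) -- hypothetical field trials.
trials = [
    ("G1", "wet", 2.1), ("G1", "dry", 1.0),
    ("G2", "wet", 2.4), ("G2", "dry", 1.9),
    ("G3", "wet", 1.5), ("G3", "dry", 1.4),
]

by_genotype = defaultdict(list)
for genotype, env, yield_kg in trials:
    by_genotype[genotype].append(yield_kg)

# Sort genotypes by mean yield across environments, best first.
ranked = sorted(by_genotype.items(),
                key=lambda kv: sum(kv[1]) / len(kv[1]), reverse=True)
for genotype, yields in ranked:
    print(genotype, round(sum(yields) / len(yields), 2))
```

Note that G3, though lower-yielding on average, varies least between wet and dry conditions; stress tolerance is exactly this kind of trade-off that the analytics must surface alongside raw yield.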

“Since our breakthrough on Theobroma cacao, we’ve already seen great improvements in cocoa,” Schmitz said. “When exascale comes online it will introduce food and agriculture data scientists to an exciting new world of opportunity.”

He explained that exascale will provide food data scientists with an unprecedented level of computing power to probe molecular food chemistry in a manner akin to how the pharmaceutical industry uses technology to study protein molecular dynamics.

“Modeling, simulation and data analytics with exascale will inform food design in a way that the empirical method, or trial and error, never could,” Schmitz said. “There is possibility for this to help unlock some of the biggest food and nutritional challenges that we face today.”

Designing More Efficient Manufacturing Processes

The HPC teams at Mars, which partner with DOE National Laboratories to bolster their computational science efforts, use modeling, simulation, and data analytics to optimize not only the company’s supply-chain processes but also the design of its manufacturing processes. The teams employ HPC tools such as computational fluid dynamics, the discrete element method, and deterministic multiphysics models.

“We’re applying multiphysics models to better understand some of our essential processes such as extrusion,” Kennedy said. Extrusion is a fundamental process in which product ingredients are fed into a barrel and driven forward by a rotating screw. The functions of mixing, sterilization, or cooking may take place in the barrel. Mars products such as confection candy, toffee candy, and pet food undergo extrusion.

“If we’re designing a new extrusion process, we’ll use modeling to optimize the design,” Kennedy said. “In the past, we would over-engineer and end up with an extruder that was one-and-a-half times bigger than what we needed. Modeling enables us to understand what the design parameters should be before we cut steel and build anything. But we’ve learned we need more computing power and speed, like what exascale will provide, to handle the complexity of our processes.”

Reducing the Greenhouse Gas Footprint

Exascale will enable the food industry to pioneer more efficient manufacturing processes that use less energy, in turn lessening its environmental impact.

“The food and agriculture sectors are among the largest contributors to climate change and the loss of biodiversity,” Schmitz said. “The energy required in global agriculture, the greenhouse gases emitted, and the vast amount of land used are all contributors. The good news is that the advancements in HPC and the eventual arrival of exascale computing will enable the industry to better use data science advances to improve its environmental and ecological footprint.”

Spreading the Use of Data Science

“The advent of exascale will help spread the use of data science more widely,” Kennedy said. At present, most companies are facing a shortage of data scientists while the need for digitization is expanding. At the same time, companies are trying to automate some of the tasks that would normally require a data scientist, such as cleaning, normalizing, or preprocessing data for analysis, simulation, or modeling.

“Exascale will make it possible for computers to run through scenarios faster and provide the end-user with data output in language that non-experts can understand,” Kennedy said. “Then they can go about slicing and dicing the data to prepare it for simulation. I think exascale will bring that capability to the masses so that they can work directly with their data, ask the questions, and gain the insights they need for their research.”
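The preprocessing tasks Kennedy mentions automating, such as cleaning and normalizing data before simulation, look roughly like the sketch below. The field names and sensor records are hypothetical, and production pipelines would add validation, logging, and provenance tracking.

```python
# Minimal sketch of automated data preprocessing: drop records with missing
# values, then min-max scale a numeric field to [0, 1] for downstream
# simulation or modeling. Data and field names are made up for illustration.

def clean_and_normalize(records, field):
    """Remove records missing `field`, then min-max scale it to [0, 1]."""
    valid = [r for r in records if r.get(field) is not None]
    values = [r[field] for r in valid]
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero on constant data
    return [dict(r, **{field: (r[field] - lo) / span}) for r in valid]

raw = [
    {"batch": "A1", "temp_c": 118.0},
    {"batch": "A2", "temp_c": None},    # sensor dropout: dropped
    {"batch": "A3", "temp_c": 126.0},
    {"batch": "A4", "temp_c": 122.0},
]
for row in clean_and_normalize(raw, "temp_c"):
    print(row)
```

Automating such steps is what lets non-specialists "slice and dice" their own data without waiting on a scarce data scientist.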

Mars recently confirmed a collaboration agreement with the Joint Institute for Computational Sciences, a joint institute of the University of Tennessee and Oak Ridge National Laboratory. The company plans to leverage the DOE computational infrastructure to find solutions for some of its most complex challenges and opportunities.
