The Food Industry’s Next Journey — from Mars to Exascale

By Scott Gibson, Oak Ridge National Laboratory

February 12, 2018

Editor’s note: Global food producer Mars, Incorporated participates in the US Exascale Computing Project’s Industry Council, which formed in February 2017 to facilitate information exchange between the ECP and the industrial user community. In this contributed article, ORNL’s Scott Gibson describes Mars’ efforts to leverage next-generation computing technologies to improve food safety and sustainability and create manufacturing efficiencies.

Mars, the world’s leading chocolate company and one of the largest food manufacturers, has a unique perspective on the impact that exascale computing will have on the food industry.

Creating a Safer and More Sustainable Food Supply Chain

“The food industry needs to address several grand challenges by developing innovative and sustainable solutions at the intersection of food, agriculture and health. Leveraging the power of technology will be critical on this journey. Exascale, for example, is going to be a radical enabler for helping the food, nutrition and agriculture sectors to evolve and possibly even revolutionize themselves to address these grand challenges,” said Harold Schmitz, chief science officer for Mars and director of the Mars Advanced Research Institute. Schmitz is a member of the US Department of Energy’s Exascale Computing Project Industry Council, a group of external advisors from some of the most prominent companies in the United States.

The Exascale Computing Project represents the next frontier in computing. An exascale ecosystem, expected in the 2021 time frame, will provide computational and data analysis performance at least 50 times greater than that of the fastest supercomputers in use today, and it will maximize the benefits of high-performance computing (HPC) for many industries. In the case of the food industry, exascale will offer new solutions that can improve food manufacturing practices, yielding safer and more healthful products, more efficient industrial processes, and a reduced carbon footprint.

“The power of exascale has the potential to advance the work of a first-of-its-kind effort led by Mars and the IBM Research – Almaden Lab, called the Consortium for Sequencing the Food Supply Chain,” Schmitz said. The consortium is centered on surveillance, risk assessment, and diagnoses of food-borne pathogens, and it is one of the few efforts in the world using the best tools of genomics, biology, and chemistry to understand nutrition, public health, and food safety.
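In practice, "sequencing the food supply chain" means screening metagenomic sequencing reads from field, transport, and factory samples against reference pathogen genomes. The sketch below is a deliberately toy Python illustration of that idea, matching reads against k-mers drawn from two invented marker sequences; real surveillance pipelines rely on large curated reference databases and purpose-built tools, at data volumes that motivate exascale-class resources.

```python
# Illustrative sketch only: a toy k-mer screen of metagenomic reads against
# hypothetical pathogen marker sequences. Production pipelines use dedicated
# classifiers and far larger references; k=8 is a toy value (k ~ 31 is typical).

K = 8

def kmers(seq, k=K):
    """Return the set of all overlapping k-length substrings of a sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

# Hypothetical marker sequences, invented for this example.
markers = {
    "Salmonella": "ACGTTGCAGGTTACGATCCA",
    "Listeria":   "TTGACCGTAGGCTTAACGGA",
}
marker_index = {name: kmers(seq) for name, seq in markers.items()}

def screen_reads(reads, min_hits=3):
    """Count marker k-mer hits per pathogen across a batch of reads."""
    hits = {name: 0 for name in marker_index}
    for read in reads:
        read_kmers = kmers(read)
        for name, idx in marker_index.items():
            hits[name] += len(read_kmers & idx)
    return {name: n for name, n in hits.items() if n >= min_hits}

# Toy "sequencing reads" from a sample; the first overlaps the Salmonella marker.
sample_reads = ["ACGTTGCAGGTTACGATCCAGG", "GGCATTACCGATGGCAATTC"]
print(screen_reads(sample_reads))  # e.g. {'Salmonella': 13}
```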

“Although food safety has progressed immensely over the last hundred years—most notably through improvements in shelf life and the addition of macronutrients for preventive health—it remains a major challenge for food manufacturers,” Schmitz said. One in six Americans suffers a food-borne illness each year, and 3,000 of those affected die, according to the US Centers for Disease Control and Prevention. Across the globe, almost 1 in 10 people fall ill every year from eating contaminated food, and 420,000 die as a result, reports the World Health Organization.

Increased industry and regulatory attention on pathogens and contaminants such as Salmonella, Campylobacter, Listeria and aflatoxin has led to breakthroughs that make our food safer, but more must be done. Scientists need a method by which they can understand the pathogens in various contexts, including the microbial community, the microbiome and the broader food chain. Going one step further, they need a method that enables them to anticipate how a pathogen would behave in real scenarios: in a field where crops are grown and harvested, in transit across various transportation channels, or in factory environments where ingredients are processed.

“The consortium aims to revolutionize our understanding of how to predict pathogen outbreaks and discover what environments stimulate pathogens to behave badly, or what microbial environments are able to keep pathogen outbreaks under control,” Schmitz said. “In essence, we want to sequence the genome of the food supply chain and then use data analytics to understand its microbial community. We’re working at the intersection of HPC and the field of systems biology. In this case, the system is the food supply chain, from farm to fork.”

Mars has used genome sequencing to advance its efforts to improve the supply-chain sustainability of one of its key ingredients: cocoa. Cocoa is a low-yield crop grown primarily in countries that lack the scientific and technological resources to modernize it.

“We realized we needed to give the most talented agricultural scientists a toolbox to make the cocoa crop sustainable,” Schmitz said. That toolbox is the genome. So, from 2008 to 2010, Mars, IBM, the US Department of Agriculture’s Agricultural Research Service, and several other collaborators sequenced the genome of Theobroma cacao, an economically important tropical fruiting tree that is the source of chocolate.

“Analyzing genomic data allowed us to understand how diverse genotypes of cacao perform in different environments. This information is then used to breed superior varieties, with increased yields, quality and stress tolerance,” said Jim Kennedy, computational science leader at the Mars Advanced Research Institute. “We also use data analytics to understand how genetic and environmental factors contribute to pest and disease losses. This information is used to develop environmentally friendly strategies to improve crop health.”
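As a rough illustration of the genotype-by-environment analysis Kennedy describes, the sketch below fits a linear model with an interaction term to hypothetical cacao trial data. Every column name and value is invented for the example; this is not Mars' pipeline, just the statistical shape of the question.

```python
# Minimal sketch: a genotype x environment (GxE) interaction model fitted to
# made-up field-trial records with statsmodels. A significant interaction term
# would indicate that genotypes rank differently across sites.
import pandas as pd
import statsmodels.formula.api as smf

trials = pd.DataFrame({
    "genotype": ["G1", "G1", "G2", "G2", "G1", "G2"],
    "site":     ["wet", "dry", "wet", "dry", "dry", "wet"],
    "yield_kg": [2.1, 1.4, 1.8, 1.7, 1.3, 1.9],   # kg per tree per year (toy)
})

# "genotype * site" expands to genotype + site + genotype:site,
# so the fit separates main effects from the GxE interaction.
model = smf.ols("yield_kg ~ genotype * site", data=trials).fit()
print(model.summary())
```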

“Since our breakthrough on Theobroma cacao, we’ve already seen great improvements in cocoa,” Schmitz said. “When exascale comes online it will introduce food and agriculture data scientists to an exciting new world of opportunity.”

He explained that exascale will provide food data scientists with an unprecedented level of computing power to probe molecular food chemistry in a manner akin to how the pharmaceutical industry uses technology to study protein molecular dynamics.
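For readers unfamiliar with the pharmaceutical analogy: molecular dynamics advances particle positions through time under interatomic forces. The minimal sketch below integrates Lennard-Jones particles with the velocity Verlet scheme in reduced units; it stands in for production MD codes such as GROMACS or LAMMPS, not for any Mars workflow, and all parameters are toy values.

```python
# Minimal molecular-dynamics sketch: Lennard-Jones particles integrated with
# velocity Verlet. Reduced units, no boundary conditions, O(n^2) forces kept
# simple for clarity.
import numpy as np

rng = np.random.default_rng(0)
g = np.arange(3) * 1.5                        # 3x3x3 lattice, spacing 1.5 sigma
pos = np.array([[x, y, z] for x in g for y in g for z in g], dtype=float)
vel = rng.normal(0.0, 0.05, size=pos.shape)   # small random initial velocities
dt, steps = 0.005, 200

def lj_forces(pos, eps=1.0, sigma=1.0):
    """Pairwise Lennard-Jones forces on every particle."""
    f = np.zeros_like(pos)
    for i in range(len(pos)):
        d = pos[i] - pos                  # displacement vectors to all others
        r2 = np.sum(d * d, axis=1)
        r2[i] = np.inf                    # exclude self-interaction
        s6 = (sigma * sigma / r2) ** 3
        mag = 24.0 * eps * (2.0 * s6 * s6 - s6) / r2
        f[i] = np.sum(mag[:, None] * d, axis=0)
    return f

# Velocity Verlet: half-step velocities via old and new forces.
f = lj_forces(pos)
for _ in range(steps):
    pos += vel * dt + 0.5 * f * dt * dt
    f_new = lj_forces(pos)
    vel += 0.5 * (f + f_new) * dt
    f = f_new

print("mean kinetic energy per particle:",
      0.5 * np.mean(np.sum(vel * vel, axis=1)))
```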

“Modeling, simulation and data analytics with exascale will inform food design in a way that the empirical method, or trial and error, never could,” Schmitz said. “There is possibility for this to help unlock some of the biggest food and nutritional challenges that we face today.”

Designing More Efficient Manufacturing Processes

The HPC teams at Mars, which partner with DOE National Laboratories to bolster their computational science efforts, use modeling, simulation, and data analytics to optimize not only the company’s supply-chain processes but also the design of its manufacturing processes. The teams employ HPC tools such as computational fluid dynamics, the discrete element method, and deterministic multiphysics models.

“We’re applying multiphysics models to better understand some of our essential processes such as extrusion,” Kennedy said. Extrusion is a fundamental process in which product ingredients are fed into a barrel and conveyed through it by a rotating screw; mixing, sterilization, or cooking may take place in the barrel. Mars products such as confectionery, toffee candy, and pet food undergo extrusion.

“If we’re designing a new extrusion process, we’ll use modeling to optimize the design,” Kennedy said. “In the past, we would over-engineer and end up with an extruder that was one-and-a-half times bigger than what we needed. Modeling enables us to understand what the design parameters should be before we cut steel and build anything. But we’ve learned we need more computing power and speed, like what exascale will provide, to handle the complexity of our processes.”
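The workflow Kennedy describes, fixing design parameters in software before cutting steel, can be illustrated with a toy optimization. In the sketch below, made-up surrogate formulas stand in for a real extruder simulation, and scipy searches for the smallest machine that still meets an assumed throughput target. Every coefficient, bound, and unit here is invented for the example.

```python
# Illustrative sketch only: simulation-driven sizing of an extruder. The
# throughput and power expressions are toy surrogates, not real extruder
# physics; in practice each evaluation would be a CFD/multiphysics run.
import numpy as np
from scipy.optimize import minimize

TARGET_KG_H = 500.0   # assumed throughput requirement for the line

def throughput(d, l, rpm):
    """Toy surrogate for drag flow in a single-screw extruder (kg/h)."""
    return 5e-3 * d**3 * rpm          # length does not enter this toy model

def power(d, l, rpm):
    """Toy surrogate for drive power (kW); grows with size and speed."""
    return 1e-4 * d**2 * l * rpm

def objective(x):
    d, l, rpm = x
    shortfall = max(0.0, TARGET_KG_H - throughput(d, l, rpm))
    return power(d, l, rpm) + 1e3 * shortfall**2   # penalize missing target

x0 = np.array([12.0, 300.0, 60.0])                 # diameter cm, length cm, rpm
bounds = [(5.0, 30.0), (100.0, 600.0), (30.0, 300.0)]
res = minimize(objective, x0, bounds=bounds, method="L-BFGS-B")

d, l, rpm = res.x
print(f"screw diameter {d:.1f} cm, barrel length {l:.1f} cm, {rpm:.0f} rpm")
print(f"-> {throughput(d, l, rpm):.0f} kg/h at roughly {power(d, l, rpm):.0f} kW")
```

A real study would replace the surrogates with full CFD or multiphysics runs, each far more expensive than the optimizer loop around them, which is precisely where the extra computing power Kennedy mentions comes in.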

Reducing the Greenhouse Gas Footprint

Exascale will enable the food industry to pioneer more efficient manufacturing processes that use less energy, in turn lessening its environmental impact.

“The food and agriculture sectors are among the largest contributors to climate change and the loss of biodiversity,” Schmitz said. “The energy required in global agriculture, the greenhouse gases emitted, and the vast amount of land used are all contributors. The good news is that the advancements in HPC and the eventual arrival of exascale computing will enable the industry to better use data science advances to improve its environmental and ecological footprint.”

Spreading the Use of Data Science

“The advent of exascale will help spread the use of data science more widely,” Kennedy said. At present, most companies are facing a shortage of data scientists while the need for digitization is expanding. At the same time, companies are trying to automate some of the tasks that would normally require a data scientist, such as cleaning, normalizing, or preprocessing data for analysis, simulation, or modeling.

“Exascale will make it possible for computers to run through scenarios faster and provide the end-user with data output in language that non-experts can understand,” Kennedy said. “Then they can go about slicing and dicing the data to prepare it for simulation. I think exascale will bring that capability to the masses so that they can directly work with their data and gain the insights and ask the questions they need for their research.”
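The preprocessing tasks Kennedy mentions, cleaning, normalizing, and preparing data, are exactly the kind that pipeline abstractions already automate at smaller scale. A minimal sketch with scikit-learn follows; the column names are hypothetical sensor fields, not Mars data.

```python
# Minimal sketch of automated preprocessing: impute gaps, scale numeric
# columns, and encode categories in one reusable scikit-learn pipeline.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.impute import SimpleImputer
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

# Hypothetical process records with missing sensor readings.
raw = pd.DataFrame({
    "barrel_temp_c": [141.0, None, 139.5, 142.2],
    "screw_rpm":     [220.0, 225.0, None, 218.0],
    "recipe":        ["toffee", "toffee", "kibble", "kibble"],
})

numeric = ["barrel_temp_c", "screw_rpm"]
categorical = ["recipe"]

preprocess = ColumnTransformer([
    ("num", Pipeline([("impute", SimpleImputer(strategy="median")),
                      ("scale", StandardScaler())]), numeric),
    ("cat", OneHotEncoder(handle_unknown="ignore"), categorical),
])

clean = preprocess.fit_transform(raw)   # ready for modeling or simulation
print(clean)
```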

Mars recently confirmed a collaboration agreement with the Joint Institute for Computational Sciences, an institute of the University of Tennessee and Oak Ridge National Laboratory. The company plans to leverage DOE computational infrastructure to find solutions for some of its most complex challenges and opportunities.
