Nvidia’s Digital Twin Moonshot – Saving Planet Earth

By Todd R. Weiss

November 11, 2021

Nvidia is continuing to expand the enterprise digital twin capabilities of its Omniverse 3D virtual world design platform, but the company's nascent plans for its grandest digital twin yet, announced at this week's Nvidia GTC21 virtual conference, could have even bigger impact.

The GPU maker wants to push its digital twin ambitions beyond helping enterprises model products, factories and assembly lines, and create the grandest digital twin so far: a model of planet Earth itself that will continuously simulate, predict and track climate change in real time, so scientists can seek ways to slow or reverse its destructive effects.

“We will build a digital twin to simulate and predict climate change,” said Nvidia CEO Jensen Huang during his Nov. 9 keynote at the virtual event. The work will be done using a powerful new supercomputer that the company is now building, called Earth-2, which will run AI physics created by the new Nvidia Modulus AI framework at million-X speeds on the Nvidia Omniverse platform, he said.

“All the technologies we have invented up to this moment are needed to make Earth-2 possible,” said Huang. “I cannot imagine a greater and more important use.”

The Nvidia Modulus AI framework is powered by physics machine learning models that can build neural network models of industrial digital twins, which enterprises use for a wide range of development and business tasks, as well as for climate science, protein engineering and more, according to Nvidia.

Digital twins allow data scientists and researchers to conduct experiments and see results by modeling ideas on virtual representations of actual factories, industrial facilities and other physical locations, infrastructure or products, instead of using real-world facilities or products. By working with virtual representations of real-world items or facilities, development costs and complexity can be reduced, and initial development can be done with far less effort. Digital twin modeling can help solve a wide range of problems, from the molecular level in drug discovery up to global challenges like climate change, according to Nvidia.
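The predict-and-correct loop at the heart of a digital twin can be sketched in a few lines. The toy example below is purely illustrative and has nothing to do with Nvidia's actual platform or APIs: a virtual model of a cooling tank is kept in sync with noisy sensor readings from the "real" asset, after which the synced twin can answer what-if questions offline. All names and numbers are hypothetical.

```python
# Minimal sketch of the digital-twin idea: a virtual model that is kept in
# sync with sensor data from a physical asset, then used for experiments
# without touching the real asset. Hypothetical toy, not Nvidia's platform.
import random

AMBIENT = 20.0     # ambient temperature, deg C
COOL_RATE = 0.1    # the twin's assumed Newton-cooling coefficient

def step(temp, cool_rate=COOL_RATE):
    """One time step of the twin's physics model (Newton's law of cooling)."""
    return temp + cool_rate * (AMBIENT - temp)

def assimilate(model_temp, sensor_temp, gain=0.5):
    """Nudge the twin toward the real sensor reading (simple blend)."""
    return model_temp + gain * (sensor_temp - model_temp)

random.seed(0)
real = twin = 90.0
for _ in range(10):
    real = step(real, cool_rate=0.12)       # true plant, slightly different physics
    sensor = real + random.gauss(0.0, 0.3)  # noisy measurement of the real asset
    twin = assimilate(step(twin), sensor)   # twin: predict, then correct

# The synced twin can now run a what-if experiment offline,
# e.g. how many more steps until the tank cools below 25 C:
t, hours = twin, 0
while t > 25.0:
    t, hours = step(t), hours + 1
print(round(twin, 1), hours)
```

The correction step is what distinguishes a twin from a plain simulation: even though the model's physics are slightly wrong (cooling coefficient 0.1 versus the plant's 0.12), assimilating sensor data keeps the virtual state close to the real one.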

In a telephone briefing with technology journalists on Nov. 10, Huang said that Earth-2 will be fully funded by Nvidia and will add to the company's existing supercomputing accomplishments, which include the Cambridge-1 and Selene machines.

Nvidia has not yet revealed the architecture or the ultimate location for the installation of the Earth-2 machine, but it will be announced in the future, said Huang. The upcoming machine will allow Nvidia to “create the most energy efficient supercomputer ever created,” he claimed. “It will be incredibly powerful, and it will be a supercomputer that is designed for Omniverse, because if you imagine Earth as a physical thing, this will be the engine of alternate worlds,” he said of the company’s grandiose plans.

New Omniverse Refinements Unveiled at GTC21

The digital twin of Earth will be possible due to ongoing improvements in the company’s Omniverse platform, which were announced by Nvidia this week at GTC21. The Omniverse platform debuted in beta in December 2020.

Among the new features are augmented reality, virtual reality and multi-GPU rendering, as well as integrations for infrastructure and industrial digital-twin software applications from Bentley Systems and Esri, according to Nvidia.

Other new features include Omniverse Replicator, an engine that generates synthetic data for training deep neural networks, and Omniverse Avatar, which connects Nvidia technologies in speech AI, computer vision, natural language understanding, recommendation engines and simulation technologies to generate interactive AI avatars.

Also new is the integration of Nvidia CloudXR, an enterprise-class immersive streaming framework, into the Omniverse Kit toolkit, which enables developers to build native Omniverse applications and microservices that will allow users to interactively stream Omniverse experiences to their mobile AR and VR devices.

Another new feature, Omniverse VR, introduces full-image, real-time ray-traced VR that will enable developers to build their own VR-capable tools on the platform for end users. In addition, Omniverse XR Remote provides AR capabilities and virtual cameras, enabling designers to view their assets fully ray traced through iOS and Android devices.

“We now have the technology to create new 3D worlds or model our physical world,” said Huang. “Creators will make more things in virtual worlds than they do in the physical world. We built Omniverse for builders of these virtual worlds.”

Nvidia sees the promise of the Omniverse platform as one that will potentially revolutionize how 40 million 3D designers around the world collaborate, said Huang.

“Companies can build virtual factories and operate them with virtual robots in Omniverse,” he said. “The virtual factories and robots are the digital twins of their physical replica. The physical version is the replica of the digital version since they are produced from the digital original. Omniverse digital twins are where we will design, train, and continuously monitor robotic buildings, factories, warehouses and cars of the future.”

One example of this today comes from telecommunications equipment maker Ericsson, which is using Omniverse to build a digital twin of a city to configure, operate and optimize its fleet of 5G antennas and radios.

“There are 15 million 5G microcells and towers planned for global deployment in the next five years,” said Huang. “Ericsson is using Nvidia Omniverse for building digital twin environments to help determine how to place and configure each of their sites for the best coverage and network performance.”

R. Ray Wang, principal analyst with Constellation Research, told EnterpriseAI that the continuing improvements in Omniverse digital twin capabilities are intriguing for enterprises.

“People are looking at NFTs (non-fungible tokens for blockchain use), but digital twins are much more important,” said Wang. “These are strategic. They are simulation models that are running side-by-side with your analog network. These updates are what customers are looking for. It is the ability to get better simulation capabilities.”

This story first appeared on HPCwire sister site EnterpriseAI.news.
