ORNL Develops, Deploys AI Capabilities Across Research Portfolio

September 24, 2019

OAK RIDGE, Tenn., September 24, 2019—Processes like manufacturing aircraft parts, analyzing data from doctors’ notes and identifying national security threats may seem unrelated, but at the U.S. Department of Energy’s Oak Ridge National Laboratory, artificial intelligence is improving all of these tasks. To accelerate promising AI applications in diverse research fields, ORNL has established a labwide AI Initiative, and its success will help to ensure U.S. economic competitiveness and national security.

Led by ORNL AI Program Director David Womble, this internal investment brings the lab’s AI expertise, computing resources and user facilities together to facilitate analyses of massive datasets that would otherwise be unmanageable. Multidisciplinary research teams are advancing AI and high-performance computing to tackle increasingly complex problems, including designing novel materials, diagnosing and treating diseases and enhancing the cybersecurity of U.S. infrastructure.

“AI has the potential to revolutionize science and engineering, and it is exciting to be part of this,” Womble said. “With its world-class scientists and facilities, ORNL will make significant contributions.”

Across the lab, experts in data science are applying AI tools known as machine learning algorithms (which allow computers to learn from data and predict outcomes) and deep learning algorithms (which use neural networks inspired by the human brain to uncover patterns of interest in datasets) to accelerate breakthroughs across the scientific spectrum. As part of the initiative, ORNL researchers are developing new technologies to complement and expand these capabilities, establishing AI as a force for improving both fundamental and applied science.
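
In broad terms, such a workflow fits a model to labeled examples and then predicts outcomes for data it has not seen. The sketch below illustrates the idea with the open-source scikit-learn library and one of its bundled datasets; it is a generic example, not code from ORNL's projects.

```python
# A minimal "learn from data, then predict outcomes" workflow (illustrative
# sketch only; not ORNL code). Requires scikit-learn.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)        # labeled example data
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)                       # learn patterns from the data
print(f"held-out accuracy: {model.score(X_test, y_test):.3f}")
```

A deep learning variant would swap the random forest for a neural network but follows the same fit-then-predict pattern.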

Home to Summit, the world’s most powerful and smartest supercomputer, ORNL is particularly well suited for AI research. The IBM system debuted in June 2018 and resides at the Oak Ridge Leadership Computing Facility, a DOE Office of Science User Facility located at ORNL. With hardware optimized for AI applications, Summit provides an ideal platform for applying machine learning and deep learning to groundbreaking research. The system’s high memory bandwidth lets AI algorithms run faster and produce more accurate results.

Other AI-enabled machines include the NVIDIA DGX-2 systems located at ORNL’s Compute and Data Environment for Science. These appliances allow researchers to tackle data-intensive problems using unique AI strategies and to run smaller-scale simulations in preparation for later work on Summit.

“AI is rapidly changing the way computational scientists do research, and ORNL’s history of leadership in computing and data makes it the perfect setting in which to advance the state of the art,” said Associate Laboratory Director for Computing and Computational Sciences Jeff Nichols. “While Summit’s rapid training of AI networks is already assisting researchers across the scientific spectrum in realizing the potential of AI, we have begun preparing for the post-Summit world via Frontier, a second-generation AI system that will provide new capabilities for machine learning, deep learning and data analytics.”

Although ORNL researchers are applying the lab’s unique combination of AI expertise and powerful computing resources to address a range of scientific challenges, three areas in particular are poised to deliver major early results: additive manufacturing, health care and cyber-physical security.

Additive manufacturing, or 3D printing, enables researchers at the Manufacturing Demonstration Facility, a DOE Office of Energy Efficiency and Renewable Energy User Facility located at ORNL, to develop reliable, energy-efficient plastic and metal parts at low cost. Using AI, they can consistently create high-quality, specialized aerospace components. AI can instantly locate cracks and other defects before they become problems, thereby reducing costs and time to market.

Additionally, AI makes it possible for the machines to detect and repair errors in real time during the process of binder jetting, in which a liquid binding agent fuses together layers of powder particles.
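
One way to picture such in-situ monitoring is a per-layer comparison of a camera image against the expected appearance of that layer, with large deviations flagged for correction. The NumPy sketch below is hypothetical: the imaging setup, threshold and repair step are assumptions for illustration, not ORNL's actual method.

```python
# Hypothetical per-layer defect check for a binder jetting build (illustrative
# sketch; the camera model and threshold are assumptions, not ORNL's method).
import numpy as np

def flag_defects(layer_img: np.ndarray, reference: np.ndarray,
                 threshold: float = 0.15) -> np.ndarray:
    """Return a boolean mask of pixels deviating strongly from the reference."""
    diff = np.abs(layer_img.astype(float) - reference.astype(float)) / 255.0
    return diff > threshold

# After each layer is printed, image it and flag regions for repair:
layer = np.random.randint(0, 256, (64, 64), dtype=np.uint8)  # stand-in camera frame
reference = np.full((64, 64), 128, dtype=np.uint8)           # expected appearance
mask = flag_defects(layer, reference)
print(f"{mask.sum()} suspect pixels out of {mask.size}")
```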

Researchers at ORNL are also optimizing AI techniques to analyze patient data from medical tests, doctors’ notes and other health records. These techniques use natural language processing to identify patterns among notes from different doctors, extracting previously inaccessible insights from mountains of data. Combined with X-rays and other relevant tests, these insights could improve health care providers’ ability to diagnose and treat conditions ranging from post-traumatic stress disorder to cancer.
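
As a toy illustration of how language processing can surface shared patterns across notes written by different doctors, the sketch below scores pairwise similarity between short synthetic notes using TF-IDF features; the text is made up, not patient data, and this is not ORNL's pipeline.

```python
# Toy sketch: find related free-text clinical notes via TF-IDF similarity
# (synthetic text; not real patient data and not ORNL's actual pipeline).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

notes = [
    "patient reports recurring nightmares and hypervigilance",
    "follow-up: hypervigilance improved, sleep still disrupted",
    "routine checkup, no complaints",
]
vectors = TfidfVectorizer(stop_words="english").fit_transform(notes)
print(cosine_similarity(vectors).round(2))  # higher scores pair related notes
```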

For example, ORNL Health Data Sciences Institute Director Gina Tourassi uses AI to automatically compile and analyze data and determine which factors are responsible for the development of certain diseases. Her team is running machine learning algorithms on Summit to scan millions of medical documents in pursuit of these types of insights.

Cybersecurity platforms such as Situ monitor thousands of network events per second to detect anomalies that human analysts could not find on their own. Situ sorts through massive amounts of raw network data, freeing network operators to focus on a small, manageable set of flagged activity, investigate potential threats and make better-informed decisions.
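
Anomaly detection of this kind is often framed as scoring each event by how far it deviates from typical behavior. The sketch below does this with scikit-learn's IsolationForest over made-up event features; the features and data are assumptions for illustration, not how Situ itself is built.

```python
# Illustrative anomaly scoring over synthetic network-event features
# (assumed features; not Situ's actual implementation).
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
normal = rng.normal(loc=[500.0, 80.0], scale=[50.0, 5.0], size=(1000, 2))
outlier = np.array([[5000.0, 10.0]])          # one event far outside the norm
events = np.vstack([normal, outlier])

scores = IsolationForest(random_state=0).fit(events).decision_function(events)
print("most anomalous event index:", int(np.argmin(scores)))  # flags the outlier
```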

Through partnerships with power companies, ORNL has also applied AI to power grid security, monitoring data streams and identifying suspicious activity.

To date, ORNL researchers have earned two R&D 100 Awards and 10 patents for work related to AI research and algorithm development. The lab plans to recruit additional AI experts to continue building on this foundation.

To ensure that U.S. researchers maintain leadership in R&D innovation and continue revolutionizing science with AI, ORNL also provides professional development opportunities including the Artificial Intelligence Summer Institute, which pairs students with ORNL researchers to solve science problems using AI, and the Data Learning Users Group, which allows OLCF users and ORNL staff to practice using deep learning techniques.

ORNL also collaborates with the University of Tennessee, Knoxville, to support the Bredesen Center Ph.D. program in data science and engineering, a curriculum that combines data science with scientific specialties ranging from materials science to national security.

UT-Battelle LLC manages Oak Ridge National Laboratory for DOE’s Office of Science, the single largest supporter of basic research in the physical sciences in the United States. DOE’s Office of Science is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science.


Source: Elizabeth Rosenthal, ORNL
