DOE Sets Sights on Accelerating AI (and other) Technology Transfer

By John Russell

October 3, 2019

For the past two days, DOE leaders, along with roughly 350 attendees from academia and industry, gathered in Chicago to discuss AI development and the ways in which industry and DOE could collaborate to foster AI commercialization. The occasion was DOE’s fourth InnovationXLab Summit; the earlier three summits covered advanced manufacturing, grid modernization, and energy storage. Two more are planned: bio-manufacturing in January and quantum science in the late spring.

On the agenda were AI technology and policy. While the discussion rarely got into the technology weeds, it was nonetheless a fascinating glimpse into AI prospects and challenges as seen by this experienced group, and it was meant to stimulate a broader conversation between industry and DOE. The current Administration clearly shares the view that AI could well become transformative, a point made by Secretary of Energy Rick Perry in his opening remarks and further demonstrated last month by DOE’s formal establishment of an Artificial Intelligence and Technology Office (AITO).

The latest InnovationXLab also seems indicative of the Administration’s desire to drive greater technology transfer from the national labs. (Did you know DOE has its first chief commercialization officer, Conner Prochaska, appointed in 2018 as CCO and director of the DOE Office of Technology Transitions?) Back in July DOE launched its Lab Partnering Service (LPS) – a web portal intended to showcase DOE capabilities and facilitate match-making between potential users and DOE resources.

Broadly the LPS has three parts:

  • Connect with Experts. Unprecedented access to top national lab researchers allows investors and innovators to connect with relevant subject matter experts and receive unbiased, non-competitive technical assessments.
  • Technical/Marketing Summaries. Direct access to pre-validated technologies that are ready to license and commercialize.
  • Visual Patent Search. A dynamic online search and visualization tool for patents associated with DOE laboratories.

Robert Bectel, senior program analyst for DOE’s Office of Technology Transitions, gave a brief talk on LPS capabilities.

“DOE has a lot of places for you to engage,” noted Bectel. “You can gain entry into the National Lab infrastructure to use these facilities. You typically do not rent the whole building, you rent a part of the building, [and] when you’re in that building, you’re not necessarily the expert using the machinery; you’ve designed the experiment, [and] the experts come from the labs. So you are in essence renting a team [or] a tool inside of a facility.

“We have 1,219 technology summaries online. Those are technologies that the national labs have provided to us for purposes of saying these are ready, these are technologies we think are ready to go to market. And here’s how you can connect with us to use them and pull them out quickly. The last thing, of course, is patents. We have about 37,977 patents online from the last 20 years,” Bectel said.

Much was rightly made of DOE’s formidable HPC resources (four of the top ten fastest computers worldwide) and experimental facilities, as well as its patent portfolio and deep bench of expertise. Driving technology transfer has always been challenging, and it seems clear that DOE is looking for ways to become better at it.

Before digging into some of the AI-specific content, it’s perhaps worth quoting Rick Stevens, associate laboratory director at Argonne National Laboratory, who served as both host and panel participant. Panel moderator Dimitri Kusnezov, DOE’s Deputy Under Secretary for Artificial Intelligence and Technology, asked Stevens, “From the point of view of the labs and novel partnerships, what is the message from the national labs?”

Stevens said, “The first message is we’re kind of open for business. I mean, that’s the first thing. The second thing is that we’re hungry for interesting problems that are going to challenge people. One way you get national lab scientists motivated isn’t necessarily by throwing money at them – that helps, of course – but by giving them [problems] that they find so compelling that they’re going to stay up all night and work on them over and over again…I’m sure in this room there are people from industry that have some of those problems that keep them up at night, whether it’s cybersecurity or whether it’s grid stability, or whether it’s drug development.”

It will be interesting to track DOE’s tech transfer efforts going forward, particularly given potential game-changers such as AI (near-term) and quantum information science (long-term).

Source: IBM rendition of HPC and AI development timelines

Capturing the full scope of the conference is beyond this brief article. Conference organizers have posted links to the portions that were livestreamed (mostly plenary sessions). Presented here are summaries of the first panel (AI Research and DOE National Labs, National AI Initiative, Opportunities for Industry Collaborations in AI), which examined top-line opportunities and challenges for AI in medicine, energy, cybersecurity, and academia.

Panelists included:

  • John Baldoni, CTO of Integral Health, a start-up seeking to use AI in drug discovery and proof of pharmacology. He was also a longtime senior research executive at GlaxoSmithKline.
  • Theresa Christian, manager, corporate strategy, innovation and sustainability, for Exelon, a large power generation and transmission company serving the U.S.
  • Ray Rothrock, chairman and CEO, Red Seal, a cyber-security specialist, which “models [the] entire hybrid data center of public cloud, private cloud, and physical networks.”
  • Stevens, associate lab director at ANL and leader of Argonne’s Exascale Computing Initiative.

To some extent their comments were familiar to AI watchers. The flood of data and the diversity of data types in virtually all domains are creating the opportunity for AI methods to generate actionable insight. In biomedicine, for example, a vast amount of data is locked up inside pharma. With easier access to this data, AI approaches can find associations and potential therapies that humans and traditional analytics have trouble uncovering.

Baldoni helped spearhead an effort involving GlaxoSmithKline (GSK), Lawrence Livermore National Laboratory, the National Cancer Institute, and others to build a central repository. “The consortium we put together is called ATOM – Accelerating Therapeutic Opportunities in Medicine,” he said. “They did an experiment using that dark data about a year after it was curated to design a molecule for a specific cancer target that’s of interest in the industry. They used simulations and optimized that molecule in a multi-parameter optimization in a compute run against toxicology, [bio] distribution, efficacy, the ability to synthesize the molecule and the ability of the molecule to distribute in tissue. It took 16 hours to come up with 200 virtual molecules that weren’t in the training set.”

Researchers found a promising molecule which is “actually in phase two clinical trials for that cancer target. On average, to get a molecule that would make it to phase two, the industry tests millions of compounds, synthesizes tens of thousands of compounds, and takes about five years. So, the computer run took 16 hours.” Unlocking the dark data in this instance was the big key.
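In essence, that kind of multi-parameter optimization scores each candidate molecule against several competing objectives at once. Below is a minimal sketch of the idea in Python; the property predictors, weights, and molecule names are invented for illustration and are not ATOM’s actual models or criteria.

```python
# Hypothetical multi-parameter scoring of candidate molecules.
# All property values and weights are illustrative stand-ins,
# not ATOM's actual pipeline.

from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    efficacy: float             # predicted target activity, 0..1 (higher is better)
    toxicity: float             # predicted toxicity risk, 0..1 (lower is better)
    synthesizability: float     # ease of synthesis, 0..1 (higher is better)
    tissue_distribution: float  # predicted tissue distribution, 0..1 (higher is better)

# Weights expressing how much each objective matters; purely illustrative.
# Toxicity gets a negative weight so riskier molecules score lower.
WEIGHTS = {"efficacy": 0.4, "toxicity": -0.3,
           "synthesizability": 0.15, "tissue_distribution": 0.15}

def score(c: Candidate) -> float:
    """Weighted-sum scalarization of the competing objectives."""
    return (WEIGHTS["efficacy"] * c.efficacy
            + WEIGHTS["toxicity"] * c.toxicity
            + WEIGHTS["synthesizability"] * c.synthesizability
            + WEIGHTS["tissue_distribution"] * c.tissue_distribution)

candidates = [
    Candidate("mol-001", efficacy=0.91, toxicity=0.20,
              synthesizability=0.70, tissue_distribution=0.60),
    Candidate("mol-002", efficacy=0.84, toxicity=0.05,
              synthesizability=0.90, tissue_distribution=0.75),
]

# Rank virtual molecules by composite score, best first.
for c in sorted(candidates, key=score, reverse=True):
    print(f"{c.name}: {score(c):.3f}")
```

In practice, each property value would come from a trained predictive model or physics-based simulation rather than a hand-entered number, and the simple weighted sum would typically give way to more sophisticated multi-objective search.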

Conversely, the flood of data sloshing about in the world also represents vulnerabilities. Rothrock knows this world well.

Theresa Christian (Exelon), Ray Rothrock (Red Seal), and Rick Stevens (ANL)

“Look, AI empowers the bad guys, but it really empowers the good guys. It’s about threat evaluation and it is about defense evaluation. [For example], the infamous video of President Obama giving a speech which he never gave. It’s simply terrifying that you can create, through data and compute, a deceptive speech. I think you can flip that same coin to the other side,” said Rothrock.

“I am a person. I have certain behaviors and habits and so forth. And if my actions are tested – when a decision is made as a leader, a political leader, a financial leader – if that decision could be taken through an AI filter to see how far off it would be from what I might be expected to decide, and if it’s way out there, then you raise a flag and ask, ‘So why is he making that decision? What’s driving that decision?’ Having enough data is again key, and, of course, also subject to misuse.”
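Rothrock’s “AI filter” amounts to anomaly detection against a behavioral baseline. A minimal sketch of that idea follows, using a simple z-score test; the decision scores, baseline data, and threshold are invented for illustration, not a description of any deployed system.

```python
# Hypothetical "AI filter" for flagging out-of-character decisions.
# The numeric decision scores and the threshold are illustrative assumptions.

import statistics

def build_baseline(history: list[float]) -> tuple[float, float]:
    """Mean and standard deviation of a leader's past decision scores."""
    return statistics.mean(history), statistics.stdev(history)

def is_anomalous(decision_score: float, mean: float, stdev: float,
                 threshold: float = 3.0) -> bool:
    """Flag a decision more than `threshold` standard deviations from baseline."""
    if stdev == 0:
        return False
    z = abs(decision_score - mean) / stdev
    return z > threshold

# Past decisions establish what "normal" looks like for this person.
past_scores = [0.42, 0.38, 0.45, 0.40, 0.44, 0.39, 0.41]
mean, stdev = build_baseline(past_scores)

# A new decision far outside the baseline raises a flag for review.
new_score = 0.95
if is_anomalous(new_score, mean, stdev):
    print("Flag raised: decision is far outside expected behavior.")
```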

One challenge Exelon deals with is managing a wide variety of data modalities in real time.

“One of our key responsibilities [as a utility] is to keep the lights on, right? If there’s a winter storm in the city where you live, and that might cause limbs to fall on wires, we need to send crews out to restore the service by repairing those wires and get the power back on as quickly, efficiently and safely as possible. Doing that relies on a number of different data streams that we maintain in house for that purpose, including, you know, the type of data that represents the assets we have in the field,” said Christian.

“We need to know everything about the geospatial distribution of our system, but also about the operational crew requirements, and what we can say about the weather that’s coming so we can better understand what the impact is likely to be. So when we think about the opportunity space for AI in an operational challenge like that, one of the really interesting areas for us is to build analytics frameworks that allow for these multiple modalities of data to be combined together so that we can develop solutions that use all those streams in a coordinated way and solve the problem at hand,” she said.
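In the simplest terms, that kind of coordinated use of multiple modalities means joining per-area asset, weather, and crew data into a single feature set that a downstream model or heuristic can consume. The toy sketch below illustrates the pattern; all field names, values, and the risk heuristic are invented, not Exelon’s actual systems.

```python
# Toy sketch of combining multiple data modalities (assets, weather, crews)
# into one view per grid area for storm-response planning. All field names
# and values are invented for illustration.

asset_data = {"area-7": {"overhead_lines_km": 120, "tree_exposure": 0.8}}
weather_forecast = {"area-7": {"wind_gust_mph": 55, "ice_accumulation_in": 0.2}}
crew_status = {"area-7": {"crews_available": 3}}

def merge_modalities(area: str) -> dict:
    """Join the per-area records from each stream into a single feature set."""
    features = {}
    for stream in (asset_data, weather_forecast, crew_status):
        features.update(stream.get(area, {}))
    return features

def outage_risk(f: dict) -> float:
    """Illustrative heuristic: tree exposure scaled by weather severity."""
    severity = f["wind_gust_mph"] / 60 + f["ice_accumulation_in"]
    return min(1.0, f["tree_exposure"] * severity)

features = merge_modalities("area-7")
risk = outage_risk(features)
print(f"area-7 outage risk {risk:.2f} with {features['crews_available']} crews available")
```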

All the panelists commented on workforce issues. There was general agreement that AI is developed most effectively in multidisciplinary environments.

“The cyber industry is about a $126 billion [market]. There are 3,000 products out there. A typical large corporation, probably like Exelon, has 50 or 60 cyber products and only five or 10 people to operate them. That number, it’s a crushing situation. And while you need engineers, for sure, you also need technicians. They don’t all need a four-year degree, they need a piece of it,” said Rothrock.

Christian said, “We expect that there are going to be people who are getting trained in other places coming in with the type of expertise to run these systems, whether it’s in the technician role or the more senior expert roles. But we are also being very intentional about bringing our internal workforce up to speed with the types of skills that they’re going to need in data-driven businesses. We’re doing a lot of internal workforce training through some of our analytics initiatives at Exelon, and we’re finding ways to be more collaborative between our in-house experts and the external community.”

Stay tuned.

Link to conference videos: https://livestream.com/argonnelive

Links to Previous InnovationXLab Summits:

Advanced Manufacturing (May 7-8, 2019)

Oak Ridge, Tennessee (hosted by Oak Ridge National Laboratory)

Grid Modernization (January 24-25, 2019)

Seattle, Washington (hosted by Pacific Northwest National Laboratory)

Energy Storage (September 18-19, 2018)

Menlo Park, California (hosted by SLAC National Accelerator Laboratory)
