DOE Sets Sights on Accelerating AI (and other) Technology Transfer

By John Russell

October 3, 2019

For the past two days, DOE leaders along with roughly 350 participants from academia and industry gathered in Chicago to discuss AI development and the ways in which industry and DOE could collaborate to foster AI commercialization. The occasion was DOE’s fourth InnovationXLab Summit – the three earlier summits covered advanced manufacturing, grid modernization, and energy storage. Two more are planned: bio-manufacturing in January and quantum science in late spring.

On the agenda was AI technology and policy. While discussion rarely got into technology weeds, it was nonetheless a fascinating glimpse into AI prospects and challenges as seen by this experienced group and meant to stimulate a broader conversation between industry and DOE. The current Administration clearly shares the view that AI could well become transformative, a point made by Secretary of Energy Rick Perry in his opening remarks and further demonstrated last month by DOE’s formal establishment of an Artificial Intelligence and Technology Office (AITO).

The latest InnovationXLab also seems indicative of the Administration’s desire to drive greater technology transfer from the national labs. (Did you know DOE has its first chief commercialization officer, Conner Prochaska, appointed in 2018 as CCO and director of the DOE Office of Technology Transitions?) Back in July DOE launched its Lab Partnering Service (LPS) – a web portal intended to showcase DOE capabilities and facilitate match-making between potential users and DOE resources.

Broadly the LPS has three parts:

  • Connect with Experts. Unprecedented access to top national lab researchers will allow investors and innovators to connect with relevant subject matter experts, and receive unbiased and non-competitive technical assessments.
  • Technical/Marketing Summaries. Direct access to pre-validated technologies that are ready to license and commercialize.
  • Visual Patent Search. Dynamic online search and visualization database tool for patents associated with DOE laboratories.

Robert Bectel, senior program analyst for DOE’s Office of Technology Transitions, gave a brief talk on LPS capabilities.

“DOE has a lot of places for you to engage,” noted Bectel. “You can gain entry into the National Lab infrastructure to use these facilities. You typically do not rent the whole building, you rent a part of the building, [and] when you’re in that building, you’re not necessarily the expert using the machinery; you’ve designed the experiment, [and] the experts come from the labs. So you are in essence renting a team [or] a tool inside of a facility.

“We have 1,219 technology summaries online. Those are technologies that the national labs have provided to us for purposes of saying these are ready, these are technologies we think are ready to go to market. And here’s how you can connect with us to use them and pull them out quickly. The last thing, of course, is patents. Well, we have about 37,977 patents online from the last 20 years,” Bectel said.

Much was rightly made of DOE’s formidable HPC systems (four of the top ten fastest computers worldwide) and experimental facilities, as well as its patent portfolio and deep bench of expertise. Driving technology transfer has always been challenging, and it seems clear that DOE is looking for ways to become better at it.

Before digging into some of the AI-specific content, it’s perhaps worth quoting Rick Stevens, associate director, Argonne National Laboratory, who played a role as host and panel participant. Panel moderator Dimitri Kusnezov, DOE’s Deputy Under Secretary for Artificial Intelligence and Technology, asked Stevens, “From the point of view of the labs and novel partnerships what is the message from national labs?”

Stevens said, “The first message is we’re kind of open for business. I mean, that’s the first thing. Second thing is that we’re hungry for interesting problems that are going to challenge people. One way you get National Lab scientists motivated isn’t by throwing money at them necessarily – that helps of course – but by giving [them problems] that they find so compelling that they’re going to stay up all night and work on them over and over again…I’m sure in this room there are people from industry that have some of those problems that keep them up at night, whether it’s cybersecurity or whether it’s grid stability, or whether it’s drug development.”

It will be interesting to track DOE’s tech transfer efforts moving forward, particularly given potential game-changers such as AI (near-term) and quantum information science (long-term).

[Figure: IBM rendition of HPC and AI development timelines]

Capturing the full scope of the conference is beyond this brief article. Conference organizers have posted links to portions that were livestreamed (mostly plenary sessions). Presented here are summaries of the first panel (AI Research and DOE National Labs, National AI Initiative, Opportunities for Industry Collaborations in AI), which examined top-line opportunities and challenges for AI in medicine, energy, cybersecurity, and academia.

Panelists included:

  • John Baldoni, CTO of Integral Health, a start-up seeking to use AI in drug discovery and proof of pharmacology. He was also a longtime senior research executive at GlaxoSmithKline.
  • Theresa Christian, manager, corporate strategy, innovation and sustainability, for Exelon, a large power generation and transmission company serving the U.S.
  • Ray Rothrock, chairman and CEO of RedSeal, a cybersecurity specialist whose platform “models [the] entire hybrid data center of public cloud, private cloud, and physical networks.”
  • Stevens, associate lab director at ANL and leader of Argonne’s Exascale Computing Initiative.

To some extent their comments were familiar to AI watchers. The flood of data and the diversity of data types in virtually all domains are creating opportunities for AI methods to produce actionable insight. In biomedicine, for example, a vast amount of data is locked up inside pharma companies. With easier access to this data, AI approaches can find associations and potential therapies that humans and traditional analytics have trouble uncovering.

Baldoni helped spearhead an effort involving GlaxoSmithKline (GSK), Lawrence Livermore National Laboratory, the National Cancer Institute and others to build a central repository. “The consortium we put together is called ATOM – Accelerating Therapeutics for Opportunities in Medicine,” he said. “They did an experiment using that dark data about a year after it was curated to design a molecule for a specific cancer target that’s of interest in the industry. They used simulations and optimized that molecule in a multi-parameter optimization in a compute run against toxicology, [bio] distribution, efficacy, the ability to synthesize the molecule and the ability of the molecule to distribute in tissue. It took 16 hours to come up with 200 virtual molecules that weren’t in the training set.”

Researchers found a promising molecule which is “actually in phase two clinical trials for that cancer target. On average, to get a molecule that would make it to phase two, the industry tests millions of compounds, synthesizes tens of thousands of compounds, and takes about five years. So, the computer run took 16 hours.” Unlocking the dark data in this instance was the big key.
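For readers curious how such a multi-parameter compute run is structured, below is a minimal, purely illustrative Python sketch: score generated candidate molecules against several property objectives at once and keep the best 200. The property names, predictor, and weights are hypothetical stand-ins, not ATOM’s actual models or pipeline.

```python
import random

# Hypothetical property axes; in a real pipeline each would be a trained
# surrogate model (toxicology, biodistribution, efficacy, synthesizability).
PROPERTIES = ("toxicology", "distribution", "efficacy", "synthesizability")

def predict(molecule, prop):
    """Stand-in for a learned property predictor; returns a score in [0, 1]."""
    return random.random()

def composite_score(molecule, weights):
    """Weighted multi-parameter objective combining all property scores."""
    return sum(weights[p] * predict(molecule, p) for p in PROPERTIES)

def optimize(candidates, weights, top_n=200):
    """Rank generated candidates by the composite objective; keep the best."""
    return sorted(candidates, key=lambda m: composite_score(m, weights),
                  reverse=True)[:top_n]

if __name__ == "__main__":
    # Hypothetical generated library (SMILES strings in practice).
    virtual_library = [f"mol_{i}" for i in range(10_000)]
    weights = {p: 0.25 for p in PROPERTIES}  # equal weights, illustrative only
    shortlist = optimize(virtual_library, weights)
    print(f"kept {len(shortlist)} candidate molecules")
```

The point of the sketch is the shape of the computation – many candidates scored against many objectives simultaneously – which is what makes this workload a natural fit for DOE-scale HPC.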

Conversely, the flood of data sloshing about in the world also represents vulnerabilities. Rothrock knows this world well.

[Photo: Theresa Christian (Exelon), Ray Rothrock (RedSeal), and Rick Stevens (ANL)]

“Look, AI empowers the bad guys, but it really empowers the good guys. It’s about threat evaluation and it is about defense evaluation. [For example], the infamous video of President Obama giving a speech which he never gave. It’s simply terrifying that you can create, through data and compute, a deceptive speech. I think you can flip that same coin to the other side,” said Rothrock.

“I am a person. I have certain behaviors and habits and so forth. And if my actions are tested, when a decision is made as a leader – a political leader, a financial leader – if that decision could be taken through an AI filter to see how far off it would be from what I might be expected to decide, and if it’s way out there, then you raise a flag and ask, ‘So why is he making that decision? What’s driving that decision?’ Having enough data is again key, and, of course, also subject to misuse.”
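As a rough illustration of the “AI filter” Rothrock describes – comparing a new decision against a model of past behavior and flagging outliers – here is a minimal sketch. The single numeric feature and the three-sigma threshold are hypothetical simplifications; a real system would model far richer behavioral signals.

```python
import statistics

def flag_decision(history, new_decision, threshold=3.0):
    """Flag a decision that sits far outside someone's historical behavior.

    A toy 'AI filter': compute how many standard deviations the new
    decision is from the mean of past decisions; flag beyond `threshold`.
    """
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return new_decision != mean
    return abs(new_decision - mean) / stdev > threshold

# Past decisions reduced (hypothetically) to one numeric feature.
past_decisions = [10.2, 9.8, 10.5, 10.1, 9.9, 10.3]
print(flag_decision(past_decisions, 10.4))  # False: consistent with history
print(flag_decision(past_decisions, 25.0))  # True: raise the flag
```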

One challenge Exelon deals with is managing a wide variety of data modalities in real-time.

“So one of our key responsibilities as [a utility] is to keep the lights on, right. If there’s a winter storm in the city where you live, and that might cause limbs to fall on wires, we need to send crews out to restore the service by repairing those wires and get the power back on as quickly, efficiently and safely as possible. To do that relies on a number of different data streams that we maintain in house for that purpose, including, you know, the type of data that represents the assets we have in the field,” said Christian.

“We need to know everything about the geospatial distribution of our system, but also about the operational crew requirements, and what we can say about the weather that’s coming so we can better understand what the impact is likely to be. So when we think about the opportunity space for AI in an operational challenge like that, one of the really interesting areas for us is to build analytics frameworks that allow for these multiple modalities of data to be combined together so that we can develop solutions that use all those streams in a coordinated way and solve the problem at hand,” she said.
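A rough sketch of what combining those modalities might look like in code: fold hypothetical asset, customer, weather, and crew-logistics streams into a single dispatch-priority score. The fields, normalizations, and weights below are invented for illustration and are not Exelon’s actual analytics framework.

```python
from dataclasses import dataclass

@dataclass
class Segment:
    """One stretch of the distribution network (all fields hypothetical)."""
    asset_age_years: float    # asset-registry data stream
    customers_served: int     # geospatial / customer data stream
    forecast_wind_mph: float  # weather data stream
    crew_travel_min: float    # crew-logistics data stream

def priority_score(seg):
    """Combine the four data modalities into one dispatch-priority score.

    Hand-tuned weights for illustration; a production system would learn
    them from historical outage data.
    """
    exposure = seg.forecast_wind_mph / 60.0            # storm severity
    fragility = min(seg.asset_age_years / 40.0, 1.0)   # older assets fail more
    impact = seg.customers_served / 1000.0             # outage footprint
    access = 1.0 / (1.0 + seg.crew_travel_min / 30.0)  # crew reachability
    return exposure * fragility * impact * access

segments = [
    Segment(asset_age_years=35, customers_served=1200,
            forecast_wind_mph=45, crew_travel_min=20),
    Segment(asset_age_years=10, customers_served=300,
            forecast_wind_mph=55, crew_travel_min=10),
]

# Dispatch crews to the highest-priority segments first.
for seg in sorted(segments, key=priority_score, reverse=True):
    print(f"{priority_score(seg):.2f}  {seg}")
```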

All the panelists commented on workforce issues. There was general agreement that AI is developed most effectively in multidisciplinary environments.

“The cyber industry is about a $126 billion [market]. There are 3,000 products out there. A typical large corporation, probably like Exelon, has 50 or 60 cyber products and only five or 10 people to operate them. That number – it’s a crushing situation. And while you need engineers, for sure, you also need technicians. They don’t all need a four-year degree, they need a piece of it,” said Rothrock.

Christian said, “We expect that there’s going to be people who are getting trained in other places coming in with the type of expertise to run these systems, whether it’s in the technician role, or the more senior expert roles. But we also are being very intentional about bringing our internal workforce up to speed with the types of skills that they’re going to need in data-driven businesses. We’re doing a lot of internal workforce training through some of our analytics initiatives internally at Exelon, and we’re finding ways to be more collaborative between our in-house experts and the external community.”

Stay tuned.

Link to conference videos: https://livestream.com/argonnelive

Links to Previous InnovationXLab Summits:

Advanced Manufacturing (May 7-8, 2019)

Oak Ridge, Tennessee (hosted by Oak Ridge National Laboratory)

Grid Modernization (January 24-25, 2019)

Seattle, Washington (hosted by Pacific Northwest National Laboratory)

Energy Storage (September 18-19, 2018)

Menlo Park, California (hosted by SLAC National Accelerator Laboratory)
