Quantum Centers, National Labs Tackle Workforce and Public-Private Partnering

By John Russell

April 28, 2023

This year, the U.S. National Quantum Initiative Act (NQIA), passed in 2018, is up for re-authorization by Congress. NQIA is a complicated, $2B-plus effort, and one of its centerpieces was the creation of five QIS research centers based at Department of Energy national laboratories. The centers are charged with advancing QIS (quantum information science) research, collaborating with industry, and helping to develop the workforce required to sustain QIS writ large.

If you’re a quantum watcher, you know there are similar efforts around the world. Just this year, the U.K. issued its own £2.5B, 10-year national quantum strategy. China, the EU, Japan, and others have all joined the quantum race. So: how are we doing? At the Quantum.tech23 conference held in Boston this week, there was a fascinating panel with senior representatives from three U.S. QIS centers and a leader of the UK’s growing initiative.

The panel, moderated by Bob Sorensen of Hyperion Research, focused mainly on three topics: workforce development, public-private collaboration, and government procurement. Panelists included (with center descriptions):

  • Anna Grassellino, director of the Superconducting Quantum Materials and Systems Center (SQMS), Fermilab. Description: “The primary mission of SQMS is to achieve transformational advances in the major crosscutting challenge of understanding and eliminating the decoherence mechanisms in superconducting 2D and 3D devices, with the goal of enabling construction and deployment of superior quantum systems for computing and sensing. In addition to the scientific advances, SQMS will target tangible deliverables in the form of unique foundry capabilities and quantum testbeds for materials, physics, algorithms, and simulations that could broadly serve the national QIS ecosystem.”
  • Kimberly McGuire, chief operations officer, Co-design Center for Quantum Advantage (C2QA), Brookhaven National Laboratory. Description: “C2QA aims to overcome the limitations of today’s noisy intermediate scale quantum (NISQ) computer systems to achieve quantum advantage for scientific computations in high-energy, nuclear, chemical and condensed matter physics. The integrated five-year goal of C2QA is to deliver a factor of 10 improvement in each of software optimization, underlying materials and device properties, and quantum error correction, and to ensure these improvements combine to provide a factor of 1,000 improvement in appropriate computation metrics.”
  • Travis Humble, Director, Quantum Science Center (QSC), Oak Ridge National Laboratory. Description: “QSC is dedicated to overcoming key roadblocks in quantum state resilience, controllability, and ultimately scalability of quantum technologies. This goal will be achieved through integration of the discovery, design, and demonstration of revolutionary topological quantum materials, algorithms, and sensors, catalyzing development of disruptive technologies. In addition to the scientific goals, integral to the activities of the QSC are development of the next generation of QIS workforce by creating a rich environment for professional development and close coordination with industry to transition new QIS applications to the private sector.”
  • Sir Peter Knight, technical advisor, IUK Quantum Technology Challenge, and chair of the UK Quantum Technology Strategic Advisory Board. National Quantum Strategy description: “A 10-year vision and actions for the UK to be a leading quantum-enabled economy, recognising the importance of quantum technologies for the UK’s prosperity and security.”

No one really knows when QIS research will transition into a vibrant industry, but the question seems to have shifted from “if” to “when.” Indeed, much of the conference tackled this broader question. For the purposes of this panel, progress has actually been rather dramatic. The NQIS centers have been up and running for about three years and have, in one way or another, involved “1,200 experts, including 600 students and postdocs, across 80 academic, industry, national lab, and other national science institutions”[i]. Just over three years ago they had barely been defined.

Presented here are a few of the panelists’ comments (lightly edited) on each of the primary topics.

Workforce Building: What progress has been made, and what’s needed?

Everyone agrees there aren’t nearly enough quantum-skilled workers – and not just quantum physicists, but also people with the many engineering and technician-level adjunct skills: for example, the cryogenic expertise to run and maintain the big dilution refrigerators needed for some types of qubits. One piece of evidence is the job-listing section of a weekly newsletter assembled by Humble; its latest issue had ~13 pages of openings.

Sorensen asked the panel, “What are the labs doing, not only to build a new workforce, but also to take advantage of the existing skill set and bring the right people into the fold in terms of you’re doing this now? And can we transfer a number of these technologies and your knowledge and experience to retrain people for lateral movement?”


Humble kicked off the conversation. “My own background was to start off in theoretical chemistry, actually. So over 18 years ago, I made the transition into quantum information as a topic. What I have found is that my transition really is what drove my ability to learn and acquire new skills but that skill set was more of a toolbox. Today, what we’re finding is that ‘creating’ people who have access to that type of toolbox is what’s essential to enable the technologies that we’re creating at QSC.”

“I’ll give you an example. One of the key things that we’re focused on [is] creating new types of materials that can be used for quantum computing platforms, specifically called topological quantum materials. That requires a certain level of expertise in material science and instrumentation and measurement that you can actually recruit out of conventional material science programs. But there’s this added emphasis on the thermodynamic conditions under which those materials need to be looked at: very cold temperatures, very low vacuum, shielding and very low electromagnetic noise. At the moment, there are no well-worn paths into the field of quantum technology. That’s what we are creating,” he said.

Grassellino’s background is as a scientist at Fermilab working on particle accelerators, and SQMS is leveraging Fermilab expertise in the near term. “Part of why we came into the field is that we [built] the center (SQMS) around this unique expertise that we have developed to build gigantic particle accelerators with coherence of the order of seconds,” she said. “We thought that we had something to bring to the table in terms of expertise. SQMS has leveraged Fermilab’s expertise in growing itself and recruiting its initial team.” These team members already have expertise in things like superconducting technologies and shielding technologies – “There’s just so much that really overlaps.” – and can learn the quantum skills.

Long-term, more education and more outreach are needed. “This summer we’ll be hosting the first five-NQIS-center joint summer school. That will not be just yet another summer school where people can learn about QIS. It will be really hands-on. We will put to use all the facilities that we have at national labs – material science tools, superconducting cavities, qubits – and students [will] come and learn hands-on about this technology. I think more of this needs to be done.”

C2QA also has an outreach effort.


“What C2QA is doing and has been doing for the last two years is to host a quantum information science career fair. That allows us to meet with industry, government, academia, students, faculty and job seekers to really share an update on the market,” said McGuire. “What is the scientific progress that has been made, where’s the supply and demand, where’s the need coming from? Do we need postdocs, do we need graduate students, [are there] faculty who are not necessarily in quantum now but are interested in pivoting so that they can contribute to solving some of these QIS challenges? This virtual career fair has hosted over 1,500 participants, and we have had over 27 exhibitors. This year, we plan to double the program in size.”

It was interesting to hear the UK perspective from Knight.

“We rather overly concentrate, I think, on producing PhDs in quantum, [but] it’s really important to look at the entire spread of skills needed for this to be a genuine technology that creates the industry and employment. That means that we’ve got to do something about, for example, technicians. One of the things that we have done through our national labs in the UK is to build an apprentice academy. For example, the lead one is our National Physical Laboratory (NPL) – the UK version of NIST, if you like – where we are working on the throughput of apprentices, [who] then join the national labs as technical leads.

“Don’t forget, there’s a spectrum of skills that we need in terms of introducing the next generation. I think that’s extraordinarily important, because one needs to really demonstrate that quantum technology is not a flash in the pan, it really is something which is going to be lasting, one could build a career in it,” said Knight.

“The other thing I wanted to point out is that national labs provide [what] we can’t get anywhere else; it’s the engineering capability to build things at scale, in a resilient and reliable way, where people have been building big projects as part of their lives forever. So, bringing those people on board to deliver large-scale engineering projects [is key]. So, remember, the skill base is not just PhDs, it’s the whole ecosystem,” he said.

Are Public-Private Partnerships Critical for Nascent QIS Markets?

It’s worth noting there have always been a few productive national lab-private company collaborations. Scaling up the number of these has always been a challenge. Typically, the projects are hard, require substantial resources from both parties, and may prompt difficult IP questions around lessons learned.

Sorensen described the goal well: “[It’s] the idea of taking the good works that happens within the national labs, the expertise, the experience, the insights, and the research results, and trying to transfer those particular gains into the commercial sector in a way that doesn’t create winners or losers.” Accomplishing that goal hasn’t been trivial. That said, the quantum sector is young and one could argue that most of the technical issues being worked on are still pre-commercial.

Humble noted, “One of the unique ways we are partnering with industry is through our leadership computing facility (OLCF), which houses Frontier, currently the [first U.S.] exascale system [and] fastest supercomputer in the world. OLCF is a user facility in the DOE parlance, which means that it’s a facility by which you can gain access to that machine through a competitive process. We’ve taken that model and applied it to quantum computing access as well. Through our quantum computing user program (with DOE support), we procure access to commercial quantum computing systems, and then run a competitive process by which we award time to investigators, researchers, who can then use those systems to test out their ideas. In addition to that, we require them to make public their findings, and that serves as feedback to the vendors that [are] providing access.”

“What makes this unique is that the laboratory ends up being sort of a neutral playing ground, where you can bring in the users, you can bring in the vendors, you can have an open discussion about the results [and] the performance, and that can empower each of them to do better [in] the next generation. So, I do think that the laboratories and the centers play an important role in engaging with industry and working together around topics like this,” he said.

Grassellino cited a SQMS collaboration with quantum computer maker Rigetti.


“We took qubits from Rigetti. We dissected them with all the most advanced techniques from national labs. We do all this advanced analysis that otherwise industry would not have access to, and we learned the cause of the decoherence. Then together, we actually implemented new processes, new materials that have now demonstrated systematic improvements in coherence. And we have replicated this foundry [work across] academia, national labs, and industry, all working together on the same geometry, on the same qubit chips. So, I think that that is just an example of how we’re working, and that these partnerships can work.”

She also cited a project with IBM to better understand cryogenic systems. It turns out that particle accelerators require a lot of cryogenics, hence SQMS’ expertise. “We are here to bring our expertise and ultimately help industry succeed where they don’t have assessment tools,” she said.

Government Procurement – Boosting Both Bottom Line and Bragging Rights

R&D collaborations are nice, but nothing makes the CEO of a young company smile like selling a system to a marquee customer.

Sorensen set the stage well: “Spending and engaging with vendors; it’s a reality that there’s no greater stamp of approval on some level than being part of a Department of Energy procurement. It’s really nice when a vendor can say, we’ve just installed one of our systems at a government lab. But again, in a technology where the future is not clearly defined, you have to make some choices. How does one go about making sure that procurement policies are the best that they can be?”

In practical terms, at least for hardware, there aren’t a lot of quantum computers one can buy. D-Wave certainly touted its system sale to NASA, for example; IBM has located some of its IBM System Ones at a few facilities, though those are not sales in the traditional sense. The most recent was at the Cleveland Clinic for collaboration on bioresearch.

Humble quipped: “First off, I admire your optimism about the procurement process. But the truth is, from my perspective, it is far more focused on what is the value to the end-user to have access to these systems [and not the vendor community].”

“What we are finding right now is that we need a diversity of technologies, because there’s a diversity of applications to be investigated, as well as the diversity of performance issues to be worked out. And I think that’s true, not just in the [quantum] computing space. So, whether you’re talking about sensors or communication, right now, this technology is very new. And so there are a lot of examples where we just need to try things. And so that’s what’s motivating these selections is to try to get an understanding of what the technology can do before we make longer-term commitments to any particular solution,” said Humble.

The UK perspective was again interesting.

“One of the issues is to make sure that we have those solutions that survive, and investment is necessary for those things to come to fruition. I think government procurement, as an agile customer of early stage work, is really quite important to actually be able to say: we trust this company to do something [that’s] going to be of interest and make sure it works,” said Knight.

“So, I think that if you want to see some of these startups emerge into something of substance, collaborative R&D with them is all very well, but an order that turns up in their balance sheet is something else. Now, we’re not producing some sort of imprimatur of approval of that product. But we’re encouraging them to survive and generate something we should then test as [we] bring the user community to use it. So, agile government procurement, as an early-stage adopter, is really important for us, because it means that we can spread our bets.”

It will be interesting to see if this time next year there are systems to be bought.

[i] https://www.quantum.gov/wp-content/uploads/2023/03/NQIAC-2023-03-24-Written-Comments.pdf
