Merzbacher Q&A: Deep Dive into the Quantum Economic Development Consortium

By John Russell

March 29, 2022

Building the quantum information sciences (QIS) industry — or more accurately, helping it build itself — is Celia Merzbacher’s job as executive director of the Quantum Economic Development Consortium (QED-C). The QED-C was brought into being with the 2018 National Quantum Initiative Act and is broadly overseen by NIST and the National Quantum Coordination Office.

Merzbacher is a scientist (materials and nanotechnology) who brings a solid mix of research, commercial and government experience to her position. “It’s the perfect job for me,” she says. At the White House Office of Science and Technology Policy, she oversaw the National Nanotechnology Initiative. She also served as executive director of the President’s Council of Advisors on Science and Technology (PCAST). At Oak Ridge National Laboratory, she was director of strategic and institutional planning. At the Semiconductor Research Corporation, she was vice president for innovative partnerships. Last year, Merzbacher was elected a AAAS Fellow.

Celia Merzbacher, QED-C

Fostering development of quantum computing is only part of QED-C’s mission. Understanding the technology and commercial needs of the entire quantum landscape and accelerating its progress is the broad goal. Recently, Merzbacher talked with HPCwire about the challenges and opportunities. In the course of our conversation, she noted: “[You] asked at the beginning a question around what does this whole landscape look like? You would think if anybody had that map, it would be QED-C. But it’s not something that is readily available.”

Presented here are portions of that interview, which covered technology bottlenecks, workforce issues, governance and coordination challenges, and QED-C’s near-term priorities.

HPCwire: Let’s start with some background on QED-C.

Merzbacher: First, we aren’t just about computing. We’re really about all applications of Quantum 2.0, as it’s sometimes called. The organization was actually called for in the 2018 National Quantum Initiative, which said NIST was to establish a consortium of stakeholders. So that’s QED-C. We got what I call startup funding from NIST, but we’re really industry-driven. We have today about 200 non-government members, predominantly corporate, so about 150-plus companies of all sizes from all parts of the quantum ecosystem or, if you want to call it, the supply chain. That includes tier-two and tier-three suppliers of measurement capabilities or equipment or lasers or electronics and other non-quantum but enabling technologies, all the way up to system developers and software and application developers and even end users – who I would say are smartly looking at how quantum technology is advancing and how it might intersect their own future in a sort of disruptive way.

You can see all of our current members on our website. In terms of numbers of companies, it’s predominantly small- and medium-sized companies, although there are all the big players you would think of: IBM, Google, Microsoft, Amazon, etc. Of course, those companies aren’t pure-play quantum companies; they sort of have a little quantum startup inside of them. So all of our members behave a little bit like a small company in some sense. We also have companies like Deloitte and Booz Allen and those kinds of consulting companies.

HPCwire:  The quantum ecosystem has dramatically expanded in recent years. What’s the QED-C mission? And how do you sort of operationalize that mission?

Merzbacher: Our mission is easy to say – it’s to enable and grow the quantum industry. QED-C is a very bottom-up type of organization. It’s very lean and we have a number of committees where the members come together to do activities. These range from committees on use cases – understanding what the markets and applications are, what the size of those markets might be, what the readiness level of the technology is, and how long it’s going to take to get there. Collectively, we have a lot of intelligence on where things are and where they’re going, but we don’t have all the answers yet. Internally, a lot of information is being shared. I hope that over time we will be able to put out more digested documentation to help get solid, credible information to the public and to the community, which sometimes thinks: Is this just hype? I don’t get it yet. Where’s it going?

We’re in a position to help educate and explain. Over time, I hope we can do that with the sort of use cases work that we’re doing. We recently, for instance, had a workshop – which hasn’t produced a report quite yet – on quantum computing for the electric grid. So that’s an interesting use case; let’s dig in on it a little bit, bring together that community, which maybe isn’t very expert at quantum, and look at how quantum might be useful in their sector. There’s a committee focused on enabling technologies, and we’ve done deep dives and some roadmaps on technologies in the cryogenic space, the laser space, the electronics space – a lot of different non-quantum technologies that have specific requirements when used in a quantum application. For example, there aren’t really off-the-shelf lasers or electronics that meet those specifications today. Many companies have to test, modify, or sometimes make their own. There’s a big supply chain issue, frankly, and we’re trying to identify those gaps so that they can be filled.

HPCwire: To your point about the supply chain, the overlap with DOE work is interesting. I recently talked with quantum researchers at Oak Ridge National Lab, and one of the things they’re doing is trying to develop more efficient single-photon sources that would be useful in quantum networks.

Merzbacher: If you go to our public website and you scroll down on the landing page, there is a button that says TACs — or Technical Advisory Committees — … and you can see a description of all the different committees. (List of current TACs and leaders at end of article.)

HPCwire: Within the HPC community, there’s a mixture of attitudes toward quantum technology. A few are excited. Most recognize the long-term potential. But some have grown a little weary of the stream of promises. What’s your sense of the bottlenecks? What do you see as the rate-limiters to progress?

Merzbacher: As you pointed out in your comments about quantum’s various stakeholders, the bottlenecks are kind of up and down the whole stack. So if what you’re talking about is sensing capabilities for navigation systems, that’s got a different set of technology bottlenecks and requirements than IBM’s superconducting quantum computer. And IBM’s superconducting quantum computer has different bottlenecks from Quantinuum’s trapped-ion system. It’s really kind of a wide-open field.

The other thing that I would say is that at the same time you have all of this activity going on in the private sector, you’ve also got all the investment going on in the public space, the government programs and that money that’s going to Oak Ridge, for instance, or NIST. So those investments need to be aware of what’s going on in the private sector and what industry needs. That’s where QED-C is playing a rather important role by doing these gap analyses at our workshops where we do a deep dive. We did one on single-photon sources and detectors. Not so much on what the technology requirements are, it’s almost one step before that; it turns out there isn’t even agreement on how to measure and characterize those sources. And Oak Ridge is not going to ever be a manufacturer of sources and detectors at any scale; they’re a research institution, they’re trying to push the boundaries of capabilities. You know, I have this bumper sticker I want to make, “leave no photon behind,” because you have a single photon you’re tracking [and] if you have scattering or loss of one photon, you know, it’s a disaster.

So it’s a very different set of requirements from your standard telecom system. You need to really have unique ways to characterize and measure and specify the technology, and that’s not even agreed upon. So a lot of what we talk about is the need for standards, but not the ultimate standards, just standard language to describe a single photon source or detector, for example. That’s a commercial problem. That’s not really a basic research issue. QED-C is really trying to connect the world of the basic researchers, and what they’re doing, which is pushing the boundaries of science, and the people who are trying to figure out how to make that into a business.

HPCwire: How does that coordination happen? As I understand it, there is a formal council for the DOE’s National QIS Centers with a chair drawn from the center directors [that] rotates on an annual basis. Its role is to set priorities for the centers. But we don’t have an overall Quantum Czar.

Merzbacher: Well, there is Charlie Tahan in the White House. Right? He’s sitting there at a coordination office (National Quantum Coordination Office) that’s interagency. It doesn’t have the purse strings, he doesn’t have the checkbook, but he has quite a lot of ability to make sure that what’s happening across all these different departments and agencies is being coordinated. That’s his job.

HPCwire: Good point. How does QED-C govern itself?

Merzbacher: We have a steering committee made up of members who are elected by the membership. Remember, we’re not government. We get a modest amount of funding from the government, but we are not a government agency, we are not controlled by the government, we’re controlled by our members. I have 200 bosses. The steering committee has four representatives from small companies and three representatives from large companies. It’s intentional that more seats are held by small companies. And there are two seats for government agencies. Today, those [latter two seats] happen to be from NIST and the Department of Energy. That’s my board, you could call it. I’ll point out that QED-C is actually not even a legal entity. It’s actually administered by SRI, so I work for SRI International.

HPCwire: I didn’t realize SRI was involved.

Merzbacher: Yes, it’s running the consortium on behalf of the membership and the government.

HPCwire: Relatively speaking, QED-C is still young. How do you gauge your progress? And what are your milestones for this year and next? How do you measure your success?

Merzbacher: It’s challenging. In a sense, we do a lot of what trade associations do, and I don’t know how they measure their progress. We’re not a lobbying organization – that’s a bright line because I work for an organization, SRI, which does a lot of government contract work. So we don’t lobby, but we can educate. We certainly are about educating government policymakers about what’s happening on the industry side so that they can make smart decisions. That means we go and educate, you know, examiners at the Patent Office, or we meet with people who have responsibility for developing export control regulations, or meet with program managers at DOD or DOE or NSF and say, “Hey, there’s a lack of fundamental understanding in this area. This is the basic research that the industry would love for government to cover, in addition to all the other stuff you do.”

We also do a lot to help our small member companies, just helping them understand what, as a business, they need to be aware of. Some of it isn’t specifically quantum. [For example, if] they need to get a handle on what kind of compliance is important when you’re doing government contracts, we help; or we point them to funding opportunities and help them find interns or summer jobs, because we have universities as members too. We connect the students at the universities to the companies where there are jobs. We do a lot of different things to try to get the bottlenecks unstuck. We’re also working in this area of benchmarking and standards. There are long-used benchmarks for high-performance computing. How do you measure progress in quantum computing today? We have people who are starting to do some work in that kind of space as well.

HPCwire:  What reports does QED-C issue? Are you required to submit (at least to NIST) an annual report or a quarterly update? Are those public documents?

Merzbacher: We meet with NIST constantly. We have a contract between SRI and NIST, so there are reporting requirements there. The documents we deliver to NIST are not public. We do put out some materials publicly, but this is sort of a classic member-based organization. Members are paying to be members and get some benefits as a result. Some of our reports are shared only among the members and some are made public, like one we issued recently, which actually might be of interest. It was on the requirements for the intermediate representation, or the abstraction layer, between hardware and software in a quantum computer. We did a workshop, a deep dive on that. The report was one we decided to make public, [because] we want the whole world to be thinking about, you know, how to run lots of different software on lots of different hardware.

That’s an example of a report that we put out. We also put one out last fall called A Guide to a Quantum-Safe Organization. It was aimed more at what I call the long-suffering CIO, who has responsibility for the security of IT systems at the company; they’ve heard of quantum computing, they’re not sure when it’s coming and what they need to be worried about. We put out a report to try to educate people in those kinds of roles about what the threats are from quantum computing and what they should be thinking about today to prepare. That’s a specific aspect of quantum computing — that it has the potential to break encryption.
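The encryption threat mentioned here follows two well-known rules of thumb, which this illustrative sketch (mine, not from the QED-C report) summarizes: Shor’s algorithm breaks today’s public-key schemes outright, while Grover’s search roughly halves the effective bit-security of symmetric keys.

```python
# Rough impact of quantum algorithms on common cryptography:
# - Shor's algorithm breaks RSA and elliptic-curve public-key schemes.
# - Grover's algorithm roughly halves effective symmetric key strength.

def grover_effective_bits(key_bits: int) -> int:
    """Effective security of a symmetric key against Grover search."""
    return key_bits // 2

schemes = {
    "RSA-2048":  ("public-key", "broken by Shor"),
    "ECC P-256": ("public-key", "broken by Shor"),
    "AES-128":   ("symmetric", f"~{grover_effective_bits(128)}-bit vs Grover"),
    "AES-256":   ("symmetric", f"~{grover_effective_bits(256)}-bit vs Grover"),
}

for name, (kind, impact) in schemes.items():
    print(f"{name:10s} {kind:11s} {impact}")
```

The practical upshot, as the report for CIOs argues, is that symmetric ciphers survive with larger keys while public-key infrastructure must be replaced.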

HPCwire: We haven’t talked much about when quantum information sciences will start to deliver concrete benefits. In quantum computing, the race is to achieve quantum advantage on NISQ (noisy intermediate-scale quantum) computers. What’s your take?

Merzbacher: I think quantum advantage is what everybody is excited about and eager to see, and it’s challenging for a number of reasons. One is certainly there are really big problems to be overcome to build a quantum computer that is powerful enough. You’ve got problems in everything [like] the quality of the qubits and their fidelity and their connectedness. Then you’ve got problems with error correction, and environmental control and scaling. Scaling, if I had to say, is the biggest issue in the next year or two. Even if you can demonstrate something in the lab, and you’ve got this hero sample that is great, how do you put that into manufacturing and scale it up?

The other problem is you’ve got moving goalposts, because existing high-performance computing keeps getting better. It’s hard to project when that crossover will happen. People at the labs, at places like Oak Ridge, who are much more expert than I, say quantum computing is going to be like an accelerator, just as graphics processors were an accelerator. Now we’re going to have quantum processors as accelerators. They’re thinking, at least at a schematic level, of some kind of hybrid system. But to me that [poses] another world of questions, starting with the physical architecture: if you’re going to have the quantum processor at cryogenic temperatures, it’s going to be physically separate from other processors.

An even harder problem, maybe, is that a quantum computer allows you to ask questions in a different way, right? They’re not a drop-in replacement for a digital, binary-based, classical computer. That’s liberating because you can do new things, but it’s also hugely difficult because our whole way of thinking about computer science is based on digital. Now, all of a sudden, you have to ask questions in a fundamentally different, probabilistic way; you can’t just hybridize a classical and a quantum computer, it seems to me. There are a lot of tough computer science problems if what you’re planning to do is have some kind of hybrid architecture in the long run.

HPCwire: It does seem the prevailing view in HPC is that quantum computers will become a kind of accelerator for special kinds of problems. But back to the timing of achieving quantum advantage. IBM has said 2023 will be the year it delivers quantum advantage with a ~1000-qubit system. What’s your sense?

Merzbacher: I suppose it depends on what the endpoint is. IBM has published a roadmap that you’re quoting. What will that enable? I don’t think that anybody thinks that’s going to be the computer that will break encryption, for instance. Sure, it’ll have some capability. But whether it will prove sufficiently powerful to be of disruptive, practical use for chemical industries, or folks who have computationally hard problems, is unclear. I think now is a great time for IBM and others to be exploring those opportunities.

I have also heard that the Cloud Security Alliance, thinking more about the security side, put something out recently that said the sort of Y2K moment, when we will need to have new encryption in place, is around 2030 or 2031. So they’re thinking a really powerful computer that’s able to break encryption is still almost a decade out. But that is sooner than you think, in the sense that it takes years to migrate to new cryptographic standards.
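That timeline arithmetic is often captured by what is known as Mosca’s inequality: if data must stay secret for x years and migration takes y years, you are already at risk whenever x + y exceeds the z years until a cryptographically relevant quantum computer. A minimal sketch (the specific year figures below are assumptions for illustration, not from the interview):

```python
def migration_deadline_ok(secrecy_years: int,
                          migration_years: int,
                          years_to_qc: int) -> bool:
    """Mosca's inequality: data is at risk when x + y > z,
    i.e. safe only while secrecy + migration time fits before
    a cryptographically relevant quantum computer arrives."""
    return secrecy_years + migration_years <= years_to_qc

# Assumed example: data that must stay secret 10 years, a 5-year
# migration, and ~8 years until a relevant machine -> already at risk.
print(migration_deadline_ok(10, 5, 8))  # -> False
```

Under this framing, "harvest now, decrypt later" attacks mean the migration clock starts well before any quantum computer exists.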

HPCwire: Of course, NIST has its Post-Quantum Cryptography program to help develop public-key cryptography standards able to withstand quantum attacks.

Merzbacher: Right. That program has been going for some time, and it’s making progress. I think they have a target within a year or so to select a new standard. That is sort of just in time, given how long it takes for banks and critical infrastructure and all of the different systems out there to adjust. They are all going to have to migrate to a new encryption standard, and that’s going to take quite a long time. And I think of it as having a sort of long tail, because you have embedded systems that are not readily upgraded, and they matter to a secure world. We rely so much on the ability to send and receive information.

HPCwire: What’s your sense of when we will see some applications, not security-oriented, in practical use? A few quantum companies say all they need is to be able to generate some “better” random numbers and selectively inject them into algorithms run on traditional systems to get better results. (See the HPCwire article: “Zapata Computing’s Formula for Achieving Quantum Advantage Today”)

Merzbacher: I wouldn’t be surprised. I feel like I have to put one of those big disclaimer statements in front of everything. But as you pointed out, there’s such an intensity of activity going on right now. I don’t know if you go to the Q2B meetings. They had an in-person meeting for the first time last December. It’s every year in Santa Clara and is a sort of gathering of the quantum computing clan. I felt this past year that there was a sort of solid quality to the presentations. They’re very promotional and optimistic, of course, but it seemed like they were more concrete and real progress was being made and it wasn’t just vaporware or hypotheticals. It was starting to feel like it was in that sort of three-year time horizon when real applications and products and capabilities would be in hand.
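The random-number idea the question raises can be illustrated with a classical algorithm whose entropy source is pluggable: a quantum random-number generator would simply be swapped in for the default. This sketch is mine, not Zapata’s method; the `rng` parameter stands in for whatever entropy source is used.

```python
import random

def estimate_pi(n_samples: int, rng=random.random) -> float:
    """Monte Carlo estimate of pi. `rng` is any callable returning
    uniform floats in [0, 1) -- e.g. a quantum random-number source
    could be injected here in place of the pseudo-random default."""
    inside = 0
    for _ in range(n_samples):
        x, y = rng(), rng()
        if x * x + y * y < 1.0:  # point falls inside the quarter circle
            inside += 1
    return 4.0 * inside / n_samples

print(estimate_pi(100_000))  # roughly 3.14, varying with the entropy source
```

The claim in question is that higher-quality randomness improves the statistical behavior of such sampling-based algorithms without changing the classical code at all.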

HPCwire: You’ve mentioned QED-C’s standards and benchmarking efforts. How will the rest of the world get access to the kinds of work you’re doing?

Merzbacher: So, a couple of things. One is we’re working on benchmarks and standards with a little “s”; we’re not a standards-setting organization. But those discussions need to be more open and inclusive. I mean, that’s why standards development organizations are very inclusive. We put our benchmark tool on GitHub; anybody can go and use it. QED-C is inclusive to a point. We welcome members who are U.S.-based, and we just recently opened up to members from select countries that are among the closest U.S. allies. We’re open now to companies from the UK, Australia, Japan, the Nordic countries, the Netherlands and Canada.
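The benchmark tool referenced here is QED-C’s application-oriented benchmark suite on GitHub, which scores circuits with a normalized fidelity derived from the Hellinger distance between measured and ideal output distributions. A minimal sketch of that style of metric (function and variable names are mine, not the suite’s API):

```python
from math import sqrt

def hellinger_fidelity(p: dict, q: dict) -> float:
    """Classical Hellinger fidelity between two output distributions,
    each given as a {bitstring: probability} dict."""
    keys = set(p) | set(q)
    return sum(sqrt(p.get(k, 0.0) * q.get(k, 0.0)) for k in keys) ** 2

def normalized_fidelity(measured: dict, ideal: dict, n_qubits: int) -> float:
    """Rescale so a uniformly random device scores ~0 and a perfect one 1."""
    f = hellinger_fidelity(measured, ideal)
    uniform = {format(i, f"0{n_qubits}b"): 1 / 2**n_qubits
               for i in range(2**n_qubits)}
    f_uniform = hellinger_fidelity(uniform, ideal)
    return max(0.0, (f - f_uniform) / (1.0 - f_uniform))

# A perfect 3-qubit GHZ result scores 1.0; pure noise would score 0.0.
ghz = {"000": 0.5, "111": 0.5}
print(normalized_fidelity(ghz, ghz, 3))  # -> 1.0
```

Normalizing against the uniform distribution matters because, on enough qubits, even a device returning random bitstrings gets nonzero raw fidelity by chance.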

I just came back from a trip to Europe and there’s a lot of interest in QED-C, for a number of reasons. Number one: certainly there’s exciting discovery research going on worldwide. Places like the Netherlands are really hotbeds of quantum R&D, as is the UK. They have all these startups happening. Those companies want to tap into the big markets and, of course, that would mean the U.S. along with other places. They all want to be part of QED-C so they can have access to customers and have their fingers on the pulse. Then there are suppliers to my members on this side of the ocean. So there’s a lot of interest in collaboration among these sort of like-minded countries and regions, even though they may not agree about everything.

HPCwire: What is on your near-term to-do list? What will QED-C focus on going forward?

Merzbacher: One is the supply chain and understanding it better. You even asked at the beginning a question around what does this whole landscape look like? You would think if anybody had that map, it would be QED-C. But it’s not something that is readily available. We are spending some time and working with others to develop a better picture of the supply chain and the whole ecosystem globally, because that way you can strengthen it if it looks like there’s weaknesses or gaps.

We have an ongoing effort to understand the workforce needs of this industry. There’s a lack of skilled workers, and the need is drifting more and more toward people at the technician level – not just the advanced degrees but all the way down to the folks you want in the lab. COVID has been disruptive. I was on a call today with somebody who said, “people all want to work from home now, and sorry, but I need people to come in and build stuff in the lab.” There’s a shortage of that kind of worker.

We’re trying to get the word out to students that you don’t have to get a Ph.D. in physics; if you’re a software engineer or an optics person, there are lots of opportunities. Maybe you need to take one class or something like that, but there’s a real diversity of skills that are needed. This is an area where we continue to work and try to connect people, even at the specific opportunity [level]: tell me who has an intern job they’re trying to fill, and I’ll try to connect you with a student who’s qualified.

So supply chain, workforce, this whole benchmarking and figuring out what’s needed. “Standards” is not really the right word. It’s sort of used a lot. But we’re not really at the stage where we need interoperability standards per se; rather, we need agreed-upon specifications and metrics and benchmarks. That’s really the stage we’re at with a lot of different technologies. You mentioned single-photon sources. That’s an example of one. We hear the same thing about cryogenic issues. It’s not really understood what the properties of certain materials are at cryogenic temperatures. You’re going to put something down at millikelvin [temperatures] and expect it to perform. Well, we need to understand how these materials behave at those temperatures – [things such as] tables and all kinds of components that are expected to be held at low temperatures and perform for periods of time. Some of those are the kind of fundamental materials problems that a DOE lab, or even a university, would certainly be capable of addressing. We’re trying to connect the people with the ability to answer those questions with the folks who are asking them.

HPCwire: Thank you for your time.

About Celia Merzbacher
Dr. Celia Merzbacher is the QED-C Executive Director responsible for continuing to build the consortium and managing operational aspects. Previously, Dr. Merzbacher was Vice President for Innovative Partnerships at the Semiconductor Research Corporation, a consortium of the semiconductor industry. In 2003-2008, she was Assistant Director for Technology R&D in the White House Office of Science and Technology Policy, where she oversaw the establishment and coordination of the National Nanotechnology Initiative. She also served as Executive Director of the President’s Council of Advisors on Science and Technology (PCAST).

Dr. Merzbacher began her career as a materials scientist at the U.S. Naval Research Laboratory in Washington D.C., where her research led to six patents and more than 50 technical publications. She has served as Chair of the National Materials and Manufacturing Board of the National Academies of Science, Engineering and Medicine, on the Board of Directors of ANSI, as well as on advisory boards of several university research centers.

QED-C TACs & Chairs (as of 12/21)

New Steering Committee members

  • Eric Holland, Keysight
  • Davide Venturelli, Universities Space Research Association

New TAC leadership

  • Enabling Technologies – Chair: Scott Davis, Vescent Photonics; Vice Chair: Anjul Loiacono, ColdQuanta
  • Quantum for National Security – Chair: Mike Larsen, Northrop Grumman; Vice Chair: Joseph Williams, Pacific Northwest National Laboratory
  • Quantum Law – Chair: Kaniah Konkoly-Thege, Quantinuum; Vice Chair: Ryan McKenney, Orrick
  • Standards and Performance Metrics – Chair: Elliott Mason, Young Basile Hanlon & MacFarlane; Vice Chair: Tom Lubinski, Quantum Circuits (interim)
  • Use Cases – Chair: Mark Danchak, Quantum1 Group; Vice Chair: John Prisco, Safe Quantum
  • Workforce – Chair: Charles Robinson, IBM; Vice Chair: Terrill Frantz, Harrisburg University

Feature image: Honeywell/Quantinuum’s optical conditioning apparatus for use with its ion trap quantum computer.


Newly-Observed Higgs Mode Holds Promise in Quantum Computing

June 8, 2022

The first-ever appearance of a previously undetectable quantum excitation known as the axial Higgs mode – exciting in its own right – also holds promise for developing and manipulating higher temperature quantum materials... Read more…

Nvidia Launches Hopper H100 GPU, New DGXs and Grace Superchips

March 22, 2022

The battle for datacenter dominance keeps getting hotter. Today, Nvidia kicked off its spring GTC event with new silicon, new software and a new supercomputer. Speaking from a virtual environment in the Nvidia Omniverse 3D collaboration and simulation platform, CEO Jensen Huang introduced the new Hopper GPU architecture and the H100 GPU... Read more…

PsiQuantum’s Path to 1 Million Qubits

April 21, 2022

PsiQuantum, founded in 2016 by four researchers with roots at Bristol University, Stanford University, and York University, is one of a few quantum computing startups that’s kept a moderately low PR profile. (That’s if you disregard the roughly $700 million in funding it has attracted.) The main reason is PsiQuantum has eschewed the clamorous public chase for... Read more…
