Merzbacher Q&A: Deep Dive into the Quantum Economic Development Consortium

By John Russell

March 29, 2022

Building the quantum information sciences (QIS) industry — or more accurately, helping it build itself — is Celia Merzbacher’s job as executive director of the Quantum Economic Development Consortium (QED-C). The QED-C was brought into being with the 2018 National Quantum Initiative Act and is broadly overseen by NIST and the National Quantum Coordination Office.

Merzbacher is a scientist (materials and nanotechnology) who brings a solid mix of research, commercial, and government experience to her position. “It’s the perfect job for me,” she says. At the White House Office of Science and Technology Policy, she oversaw the National Nanotechnology Initiative. She also served as executive director of the President’s Council of Advisors on Science and Technology (PCAST). At Oak Ridge National Laboratory, she was director of strategic and institutional planning. At the Semiconductor Research Corporation, she was vice president for innovative partnerships. Last year, Merzbacher was elected a AAAS Fellow.

Celia Merzbacher, QED-C

Fostering development of quantum computing is only part of QED-C’s mission. Understanding the technology and commercial needs of the entire quantum landscape and accelerating its progress is the broad goal. Recently, Merzbacher talked with HPCwire about the challenges and opportunities. In the course of our conversation, she noted: “[You] asked at the beginning a question around what does this whole landscape look like? You would think if anybody had that map, it would be QED-C. But it’s not something that is readily available.”

Presented here are portions of that interview, which covered technology bottlenecks, workforce issues, governance and coordination challenges, and QED-C’s near-term priorities.

HPCwire: Let’s start with some background on QED-C.

Merzbacher: First, we aren’t just about computing. We’re really about all applications of Quantum 2.0, as it’s sometimes called. The organization was actually called for in the 2018 National Quantum Initiative, which said NIST was to establish a consortium of stakeholders. So that’s QED-C. We got what I call startup funding from NIST, but we’re really industry-driven. We have today about 200 non-government members, predominantly corporate, so about 150-plus companies of all sizes from all parts of the quantum ecosystem or, if you want to call it, the supply chain. That includes tier-two and tier-three suppliers of measurement capabilities or equipment or lasers or electronics and other non-quantum but enabling technologies, all the way up to system developers and software and application developers and even end users – who I would say are smartly looking at how quantum technology is advancing and how it might intersect their own future in a sort of disruptive way.

You can see all of our current members on our website. In terms of numbers of companies, it’s predominantly small- and medium-sized companies, although all the big players you would think of are there: IBM, Google, Microsoft, Amazon, etc. Of course, those companies aren’t pure-play quantum companies; they sort of have a little quantum startup inside of them. So all of our members behave a little bit like a small company in some sense. We also have companies like Deloitte and Booz Allen and those kinds of consulting companies.

HPCwire: The quantum ecosystem has dramatically expanded in recent years. What’s the QED-C mission? And how do you operationalize that mission?

Merzbacher: Our mission is easy to say – it’s to enable and grow the quantum industry. QED-C is a very bottom-up type of organization. It’s very lean, and we have a number of committees where the members come together to do activities. These range from committees on use cases – understanding what the markets and applications are, what the size of those markets might be, what the readiness level of the technology is, and how long it’s going to take to get there. Collectively, we have a lot of intelligence on where things are and where they’re going, but we don’t have all the answers yet. Internally, there’s a lot of information being shared. I hope that over time we will be able to put out more digested documentation to help get solid, credible information to the public and to the community, who sometimes think: Is this just hype? I don’t get it yet. Where’s it going?

We’re in a position to help educate and explain. Over time, I hope we can do that with the use cases work that we’re doing. We recently, for instance, had a workshop – it hasn’t produced a report quite yet – on quantum computing for the electric grid. So that’s an interesting use case; let’s dig in on it a little bit, bring in that community, which maybe isn’t very expert at quantum, and look at how quantum might be useful in their sector. There’s a committee focused on enabling technologies, and we’ve done deep dives and some roadmaps on technologies in the cryogenic space, in the laser space, in the electronics space – a lot of different non-quantum technologies that have specific requirements when used in a quantum application. For example, there aren’t really off-the-shelf lasers or electronics that meet those specifications today. Many companies have to test, modify, or sometimes make their own. There’s a big supply chain issue, frankly, and we’re trying to identify those gaps so that they can be filled.

HPCwire: To your point about the supply chain, the overlap with DOE work is interesting. I recently talked with quantum researchers at Oak Ridge National Lab, and one of the things they’re doing is trying to develop more efficient single-photon sources that would be useful in quantum networks.

Merzbacher: If you go to our public website and you scroll down on the landing page, there is a button that says TACs — or Technical Advisory Committees — … and you can see a description of all the different committees. (List of current TACs and leaders at end of article.)

HPCwire: Within the HPC community, there’s a mixture of attitudes toward quantum technology. A few are excited. Most recognize the long-term potential. But some have grown a little numb to the stream of promises. What’s your sense of the bottlenecks? What do you see as the rate-limiters to progress?

Merzbacher: As you pointed out in your comments about quantum’s various stakeholders, the bottlenecks are kind of up and down the whole stack. So if what you’re talking about is sensing capabilities for navigation systems, that’s got a different set of technology bottlenecks and requirements than IBM’s superconducting quantum computer. And IBM’s superconducting quantum computer has different bottlenecks from Quantinuum’s trapped-ion system. It’s really kind of a wide-open field.

The other thing that I would say is that at the same time you have all of this activity going on in the private sector, you’ve also got all the investment going on in the public space, the government programs and that money that’s going to Oak Ridge, for instance, or NIST. So those investments need to be aware of what’s going on in the private sector and what industry needs. That’s where QED-C is playing a rather important role by doing these gap analyses at our workshops where we do a deep dive. We did one on single-photon sources and detectors. Not so much on what the technology requirements are, it’s almost one step before that; it turns out there isn’t even agreement on how to measure and characterize those sources. And Oak Ridge is not going to ever be a manufacturer of sources and detectors at any scale; they’re a research institution, they’re trying to push the boundaries of capabilities. You know, I have this bumper sticker I want to make, “leave no photon behind,” because you have a single photon you’re tracking [and] if you have scattering or loss of one photon, you know, it’s a disaster.

So it’s a very different set of requirements from your standard telecom system. You really need unique ways to characterize and measure and specify the technology, and that’s not even agreed upon. So a lot of what we talk about is the need for standards – not the ultimate standards, just standard language to describe a single-photon source or detector, for example. That’s a commercial problem. That’s not really a basic research issue. QED-C is really trying to connect the world of the basic researchers, who are pushing the boundaries of science, and the people who are trying to figure out how to make that into a business.
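Merzbacher’s point about “standard language” is quite concrete. Even a basic single-photon figure of merit, the second-order correlation g²(0), gets estimated from raw detector counts, and labs can differ on windows and corrections. Below is a minimal sketch of one common estimator from a Hanbury Brown–Twiss setup – the count rates and coincidence window are illustrative assumptions, not QED-C specifications:

```python
def g2_zero(coincidence_rate: float,
            singles_rate_1: float,
            singles_rate_2: float,
            window_s: float) -> float:
    """Estimate g2(0) from a Hanbury Brown-Twiss measurement:
    zero-delay coincidences normalized by the accidental rate
    expected from two uncorrelated detectors. A result well
    below 0.5 is the usual signature of a single-photon source."""
    accidental_rate = singles_rate_1 * singles_rate_2 * window_s
    return coincidence_rate / accidental_rate

# Illustrative numbers only: 100 kcps on each detector, a 1 ns
# coincidence window, and 0.5 coincidences per second at zero delay.
print(g2_zero(0.5, 1e5, 1e5, 1e-9))  # ~0.05, well below 0.5
```

The point is not the arithmetic but the ambiguities around it (window width, background subtraction, pulsed versus continuous excitation), which is exactly where agreed-upon measurement language is missing.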

HPCwire: How does that coordination happen? As I understand it, there is a formal council for the DOE’s National QIS Centers with a chair drawn from the center directors [that] rotates on an annual basis. Its role is to set priorities for the centers. But we don’t have an overall Quantum Czar.

Merzbacher: Well, there is Charlie Tahan in the White House. Right? He’s sitting there at a coordination office (the National Quantum Coordination Office) that’s interagency. It doesn’t have the purse strings – he doesn’t have the checkbook – but he has quite a lot of ability to make sure that what’s happening across all these different departments and agencies is being coordinated. That’s his job.

HPCwire: Good point. How does the QED-C govern itself?

Merzbacher: We have a steering committee made up of members who are elected by the membership. Remember, we’re not government. We get a modest amount of funding from the government, but we are not a government agency; we are not controlled by the government, we’re controlled by our members. I have 200 bosses. The steering committee has four representatives from small companies and three representatives from large companies. It’s intentional that more seats are held by small companies. And there are two seats for government agencies. Today, those [latter two seats] happen to be from NIST and the Department of Energy. That’s my board, you could call it. I’ll point out that QED-C is actually not even a legal entity. It’s administered by SRI, so I work for SRI International.

HPCwire: I didn’t realize SRI was involved.

Merzbacher: Yes, it’s running the consortium on behalf of the membership and the government.

HPCwire: Relatively speaking, QED-C is still young. How do you gauge your progress? And what are your milestones for this year and next? How do you measure your success?

Merzbacher: It’s challenging. In a sense, we do a lot of what trade associations do, and I don’t know how they measure their progress. We’re not a lobbying organization – that’s a bright line, because I work for an organization, SRI, which does a lot of government contract work. So we don’t lobby, but we can educate. We certainly are about educating government policymakers about what’s happening on the industry side so that they can make smart decisions. That means we go and educate, you know, examiners at the Patent Office, or we meet with people who have responsibility for developing export control regulations, or meet with program managers at DOD or DOE or NSF and say, “Hey, there’s a lack of fundamental understanding in this area. This is the basic research that the industry would love for government to cover, in addition to all the other stuff you do.”

We also do a lot to help our small member companies, just helping them to understand how, as a business, they need to be aware of things. Some of it isn’t specifically quantum – for example, helping them get a handle on what kind of compliance is important when you’re doing government contracts, or pointing them to funding opportunities, or helping them find interns or summer jobs, because we have universities as members too. We connect the students at the universities to the companies where there are jobs. We do a lot of different things to try to get the bottlenecks unstuck. We’re also working in this area of benchmarking and standards. There are long-used benchmarks for high-performance computing. How do you measure progress in quantum computing today? We have people who are starting to do some work in that kind of space as well.
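One common pattern for that kind of application-oriented benchmarking – a generic sketch, not QED-C’s published tool – is to run a circuit whose ideal output is known and score how close the measured bitstring distribution comes, for example with the Hellinger fidelity:

```python
import math
from collections import Counter

def hellinger_fidelity(ideal: dict, counts: Counter) -> float:
    """Hellinger fidelity between an ideal probability distribution
    over bitstrings and an empirical distribution built from device
    counts: 1.0 is a perfect match, near 0 means disjoint outcomes."""
    shots = sum(counts.values())
    bhattacharyya = sum(
        math.sqrt(p * counts.get(bits, 0) / shots)
        for bits, p in ideal.items()
    )
    return bhattacharyya ** 2

# Toy example: an ideal Bell state is 50/50 over '00' and '11';
# the device counts below are invented for illustration.
ideal = {"00": 0.5, "11": 0.5}
counts = Counter({"00": 470, "11": 460, "01": 40, "10": 30})
print(round(hellinger_fidelity(ideal, counts), 3))  # ~0.93
```

Scoring many such circuits at increasing width and depth gives a progress curve that can be compared across machines, the role LINPACK-style benchmarks have long played in HPC.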

HPCwire: What reports does QED-C issue? Are you required to submit (at least to NIST) an annual report or a quarterly update? Are those public documents?

Merzbacher: We meet with NIST constantly. We have a contract between SRI and NIST, so there are reporting requirements there. The documents we deliver to NIST are not public. We do put out some materials publicly, but this is sort of a classic member-based organization. Members are paying to be members and get some benefits as a result. Some of our reports are shared only among the members, and some are made public, like one we issued recently, which actually might be of interest. It was on the requirements for the intermediate representation, or the abstraction layer, between hardware and software in a quantum computer. We did a workshop, a deep dive on that. The report was one we decided to make public, [because] we want the whole world to be thinking about, you know, how to run lots of different software on lots of different hardware.
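As an aside on what that abstraction layer looks like in practice, OpenQASM is one widely used example of a hardware-agnostic intermediate representation that different vendors’ toolchains can lower onto different qubit technologies. The Bell-state program below is purely an illustration, not drawn from the QED-C report:

```python
# OpenQASM 2.0 text for a Bell-state circuit. The same source can be
# compiled to superconducting, trapped-ion, or simulated backends --
# the "lots of software on lots of hardware" goal described above.
BELL_QASM = """
OPENQASM 2.0;
include "qelib1.inc";
qreg q[2];           // two qubits
creg c[2];           // two classical readout bits
h q[0];              // put qubit 0 into superposition
cx q[0], q[1];       // entangle qubit 0 with qubit 1
measure q -> c;      // read out both qubits
"""
```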

That’s an example of a report that we put out. We also put one out last fall called A Guide to a Quantum-Safe Organization. It was aimed more at what I call the long-suffering CIO, who has responsibility for the security of IT systems at the company; they’ve heard of quantum computing, they’re not sure when it’s coming and what they need to be worried about. We put out a report to try to educate people in those kinds of roles about what the threats are from quantum computing and what they should be thinking about today to prepare. That’s a specific aspect of quantum computing — that it has the potential to break encryption.

HPCwire: We haven’t talked much about when quantum information sciences will start to deliver concrete benefits. In quantum computing, the race is to achieve quantum advantage on NISQ (noisy intermediate-scale quantum) computers. What’s your take?

Merzbacher: I think quantum advantage is what everybody is excited about and eager to see, and it’s challenging for a number of reasons. One is certainly that there are really big problems to be overcome to build a quantum computer that is powerful enough. You’ve got problems in everything [like] the quality of the qubits and their fidelity and their connectedness. Then you’ve got problems with error correction, and environmental control, and scaling. Scaling, if I had to say, is the biggest issue in the next year or two. Even if you can demonstrate something in the lab, and you’ve got this hero sample that is great, how do you put that into manufacturing and scale it up?

The other problem is you’ve got moving goalposts, because existing high-performance computing keeps getting better. It’s hard to project into the future when you’re going to have this crossover. People at the labs, at places like Oak Ridge, who are much more expert than I, say quantum computing is going to be like an accelerator, just like graphical processors were an accelerator. Now we’re going to have quantum processors as accelerators. They’re thinking, at least at the sort of schematic level, of some kind of hybrid system. But to me that [poses] another world of questions, starting with the physical architecture – especially if you’re going to have the quantum processor at cryogenic temperatures, it’s going to be physically separate from other processors.

An even harder problem, maybe, is that a quantum computer allows you to ask questions in a different way, right? They’re not a drop-in replacement for a digital, binary-based, classical computer. That’s liberating because you can do new things, but it’s also hugely difficult because our whole way of thinking about computer science is based on digital. Now, all of a sudden, you have to ask questions in a fundamentally different way, a probabilistic way, such that you can’t just hybridize a classical and quantum computer, it seems to me. There are a lot of tough computer science problems if what you’re planning to do is have some kind of hybrid architecture in the long run.

HPCwire: It does seem the prevailing view in HPC is that quantum computers will become a kind of accelerator for special kinds of problems. But back to the timing of achieving quantum advantage. IBM has said 2023 will be the year it delivers quantum advantage with a ~1000-qubit system. What’s your sense?

Merzbacher: I suppose it depends on what the endpoint is. IBM has published a roadmap that you’re quoting. What will that enable? I don’t think anybody thinks that’s going to be the computer that will break encryption, for instance. Sure, it’ll have some capability. But whether it will prove sufficiently powerful to be of disruptive, practical use for chemical industries, or folks who have computationally hard problems, is unclear. I think now is a great time for IBM and others to be exploring those opportunities.

I have also heard that the Cloud Security Alliance, thinking more about the security side, put something out recently that said the sort of Y2K moment, when we will need to have new encryption in place, is around 2030 or 2031. So they’re thinking a really powerful computer that’s able to break encryption is still almost a decade out. But that is sooner than you think, in the sense that it takes years to migrate to new cryptographic standards.
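Her “sooner than you think” framing is often formalized as Mosca’s inequality: if the years your data must stay secret (x) plus the years a migration takes (y) exceed the years until a cryptographically relevant quantum computer exists (z), the data is already exposed to harvest-now, decrypt-later collection. A back-of-the-envelope sketch – the x and y values are illustrative assumptions, and z echoes the roughly-2030 estimate cited above, counted from this 2022 interview:

```python
def mosca_at_risk(secrecy_years: float,
                  migration_years: float,
                  years_to_crqc: float) -> bool:
    """Mosca's inequality: data is at risk if x + y > z, i.e. the
    secret must outlive both the migration and the machine's arrival."""
    return secrecy_years + migration_years > years_to_crqc

# z ~ 8 years (a ~2030 machine, seen from 2022); x and y are made up.
print(mosca_at_risk(secrecy_years=10, migration_years=5, years_to_crqc=8))
# True -> records captured today could be decrypted within their lifetime
```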

HPCwire: Of course, NIST has its Post-Quantum Cryptography program to help develop public-key cryptography standards able to withstand quantum attacks.

Merzbacher: Right. That program has been going for some time, and it’s making progress. I think they have a target within a year or so to select a new standard. That is sort of just in time, given how long it takes for banks and critical infrastructure and all of the different systems that are out there to adjust. They are all going to have to migrate to a new encryption standard, and that’s going to take quite a long time. I think of it as having a sort of long tail, because you have embedded systems that are not readily upgraded yet are important to a secure world. We rely so much on the ability to send and receive information.

HPCwire: What’s your sense of when we will see some applications, not security-oriented, in practical use? A few quantum companies say all they need is to be able to generate some “better” random numbers and selectively inject them into algorithms run on traditional systems to get better results. (See the HPCwire article: “Zapata Computing’s Formula for Achieving Quantum Advantage Today”)

Merzbacher: I wouldn’t be surprised. I feel like I have to put one of those big disclaimer statements in front of everything. But as you pointed out, there’s such an intensity of activity going on right now. I don’t know if you go to the Q2B meetings. They had an in-person meeting for the first time last December. It’s every year in Santa Clara and is a sort of gathering of the quantum computing clan. I felt this past year that there was a sort of solid quality to the presentations. They’re very promotional and optimistic, of course, but it seemed like they were more concrete and real progress was being made and it wasn’t just vaporware or hypotheticals. It was starting to feel like it was in that sort of three-year time horizon when real applications and products and capabilities would be in hand.
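The “better random numbers” idea HPCwire raised is mechanically simple to picture: a classical stochastic algorithm keeps its structure but draws randomness from an external (possibly quantum) source rather than a pseudorandom generator. A toy sketch with the quantum source stubbed out – this is a generic illustration, not Zapata’s actual method:

```python
import random
from typing import Callable

def stochastic_hill_climb(score: Callable[[float], float],
                          rand: Callable[[], float],
                          x: float = 0.0, steps: int = 1000) -> float:
    """Classical search whose only randomness comes from the injected
    `rand` callable, so a hardware quantum RNG could be swapped in
    without touching the algorithm itself."""
    for _ in range(steps):
        candidate = x + (rand() - 0.5)   # proposal from the rand source
        if score(candidate) > score(x):  # keep only improving moves
            x = candidate
    return x

# Stand-in for a quantum random-number source; a real deployment would
# replace this callable with hardware-derived bits.
quantum_rand = random.random

best = stochastic_hill_climb(lambda v: -(v - 3.0) ** 2, quantum_rand)
print(round(best, 2))  # converges near 3.0
```

Whether quantum-sourced randomness actually improves such loops in practice is precisely the kind of claim that benchmarking, as discussed above, would need to test.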

HPCwire: You’ve mentioned QED-C’s standards and benchmarking efforts. How will the rest of the world get access to the kinds of work you’re doing?

Merzbacher: So, a couple of things. One is we’re working on benchmarks and standards with a little “s”; we’re not a standards-setting organization. But those discussions need to be more open and inclusive – that’s why standards development organizations are very inclusive. We put our benchmark tool on GitHub; anybody can go and use it. QED-C is inclusive to a point. We welcome members who are U.S.-based, and we just recently opened up to members from select countries that are among the closest allies of the U.S. We’re open now to companies from the UK, Australia, Japan, the Nordic countries, the Netherlands, and Canada.

I just came back from a trip to Europe, and there’s a lot of interest in QED-C, for a number of reasons. Number one: certainly there’s exciting discovery research going on worldwide. Places like the Netherlands are really hotbeds of quantum R&D, as is the UK. They have all these startups happening. Those companies want to tap into the big markets and, of course, that would mean the U.S. along with other places. They all want to be part of QED-C so they can kind of have access to customers and have their fingers on the pulse. Then there are suppliers to my members on this side of the ocean. So there’s a lot of interest in collaboration among these sort of like-minded countries and regions, even though they may not agree about everything.

HPCwire: What’s on your near-term to-do list going forward? What will QED-C focus on?

Merzbacher: One is the supply chain and understanding it better. You even asked at the beginning a question around what does this whole landscape look like? You would think if anybody had that map, it would be QED-C. But it’s not something that is readily available. We are spending some time and working with others to develop a better picture of the supply chain and the whole ecosystem globally, because that way you can strengthen it if it looks like there are weaknesses or gaps.

We have an ongoing effort to understand the workforce needs of this industry. There’s a lack of skilled workers, and the need is drifting more and more toward people at the technician level – not just the advanced degrees, but all the way down to the folks you want in the lab. COVID has been disruptive. I was on a call today with somebody who said, “people all want to work from home now, and sorry, but I need people to come in and build stuff in the lab.” There’s a shortage of that kind of worker.

We’re trying to get the word out to the students that you don’t have to get a Ph.D. in physics; if you’re a software engineer or if you’re an optics person, there’s lots of opportunities – maybe you need to take one class or something like that – but there’s a real diversity of skills that are needed. This is an area where we continue to work and try to connect people even at the specific opportunity [level]: tell me who has an intern job they’re trying to fill, and I’ll try to connect you with a student who’s qualified.

So supply chain, workforce, and this whole benchmarking effort and figuring out what’s needed. “Standards” is not really the right word. It’s sort of used a lot. But we’re not really at the stage where we need interoperability standards per se; we’re at the stage [where] we need agreed-upon specifications and metrics and benchmarks. That’s really where we are with a lot of different technologies. You mentioned single-photon sources. That’s an example of one. We hear the same thing about cryogenic issues. It’s not really understood what the properties of certain materials are at cryogenic temperatures. You’re going to put something down at millikelvin [temperatures] and expect it to perform. Well, we need to understand how these materials behave at those temperatures – [things like] tables and all kinds of components that are going to be expected to be held at low temperatures and perform for periods of time. Some of those are the kind of fundamental materials problems that a DOE lab, or even a university, would certainly be capable of addressing. We’re trying to connect the people with the ability to answer those questions with the folks who are asking them.

HPCwire: Thank you for your time.

About Celia Merzbacher
Dr. Celia Merzbacher is the QED-C Executive Director, responsible for continuing to build the consortium and managing operational aspects. Previously, Dr. Merzbacher was Vice President for Innovative Partnerships at the Semiconductor Research Corporation, a consortium of the semiconductor industry. From 2003 to 2008, she was Assistant Director for Technology R&D in the White House Office of Science and Technology Policy, where she oversaw the establishment and coordination of the National Nanotechnology Initiative. She also served as Executive Director of the President’s Council of Advisors on Science and Technology (PCAST).

Dr. Merzbacher began her career as a materials scientist at the U.S. Naval Research Laboratory in Washington, D.C., where her research led to six patents and more than 50 technical publications. She has served as Chair of the National Materials and Manufacturing Board of the National Academies of Sciences, Engineering, and Medicine, on the Board of Directors of ANSI, and on advisory boards of several university research centers.

QED-C TACs & Chairs (as of 12/21)

New Steering Committee members

  • Eric Holland, Keysight
  • Davide Venturelli, Universities Space Research Association

New TAC leadership

  • Enabling Technologies – Chair: Scott Davis, Vescent Photonics; Vice Chair: Anjul Loiacono, ColdQuanta
  • Quantum for National Security – Chair: Mike Larsen, Northrop Grumman; Vice Chair: Joseph Williams, Pacific Northwest National Laboratory
  • Quantum Law – Chair: Kaniah Konkoly-Thege, Quantinuum; Vice Chair: Ryan McKenney, Orrick
  • Standards and Performance Metrics – Chair: Elliott Mason, Young Basile Hanlon & MacFarlane; Vice Chair: Tom Lubinski, Quantum Circuits (interim)
  • Use Cases – Chair: Mark Danchak, Quantum1 Group; Vice Chair: John Prisco, Safe Quantum
  • Workforce – Chair: Charles Robinson, IBM; Vice Chair: Terrill Frantz, Harrisburg University

Feature image: Honeywell/Quantinuum’s optical conditioning apparatus for use with its ion trap quantum computer.
