Quantum Computing Steps Out of the Research Lab

By Michael Feldman

February 16, 2007

On Tuesday at the Computer History Museum in Mountain View, California, a Canadian tech startup called D-Wave demonstrated a prototype of a commercial quantum computer. The company claims its 16-qubit system is by far the most powerful quantum computer ever built and the first to run commercial applications. The purpose of the demonstration was to provide “proof-of-concept” for upcoming commercial products.

While many researchers have estimated that quantum devices will not be commercially viable for another 20 to 50 years, D-Wave founder and CTO Geordie Rose has aggressively pursued his dream of developing a commercial device in a much shorter timeframe. In 1999, he formed D-Wave to pursue superconductor-based quantum computing. A superconductor implementation was chosen because, unlike other QC approaches such as quantum dots or optical circuits, it does not rely on technologies yet to be developed.

Unlike the bits in a digital computer, the quantum bits (qubits) in a quantum computer can exist as 0, 1, or a superposition of both. This property of superposition is at the heart of quantum computing.

The D-Wave system relies on a technique called adiabatic quantum computing to do its work. The hardware consists of a 4×4 array of magnetic flux qubits, implemented as niobium rings. At temperatures close to absolute zero, the rings become superconducting, enabling them to behave quantum mechanically. Because of this quantum mechanical behavior, the 16-qubit system can work on 64K (2^16) candidate solutions simultaneously.
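To make that number concrete, here is a minimal Python sketch (using numpy; nothing here is D-Wave-specific) of how a qubit's superposition is written down and why 16 qubits correspond to 2^16 = 65,536 basis states:

```python
import numpy as np

# A qubit's state is a normalized vector of two complex amplitudes.
zero = np.array([1, 0], dtype=complex)   # the basis state |0>
one = np.array([0, 1], dtype=complex)    # the basis state |1>

# An equal superposition of |0> and |1>:
psi = (zero + one) / np.sqrt(2)

# Measurement probabilities are the squared magnitudes of the amplitudes.
print(np.abs(psi) ** 2)   # [0.5 0.5]

# n qubits require 2**n amplitudes to describe, which is why
# a 16-qubit register spans 64K basis states at once.
print(2 ** 16)            # 65536
```

The exponential size of that state vector is exactly what makes the hardware interesting: a classical simulation must track all 2^n amplitudes explicitly.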

The demonstration used the D-Wave prototype system, called Orion, running remotely at the company's headquarters in Burnaby, Canada. Three different applications were put through their paces. The first was a pattern matching application used to search a database of molecules. The second was a seating plan application, in which wedding seat assignments were subject to a number of constraints. The third solved Sudoku puzzles.

Each application was recast as a combinatorial optimization problem expressed as a graph. A conventional digital preprocessor ran the applications, while the graphs were sent to the QC hardware and distributed across the qubit array.
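As an illustration of what such a recasting can look like (a hypothetical toy version, not D-Wave's actual encoding), the seating problem maps naturally onto graph coloring: guests are vertices, an edge joins any two guests who must not share a table, and a valid plan with k tables is a proper k-coloring of the graph:

```python
from itertools import product

# Hypothetical conflict graph: an edge joins guests who must be separated.
conflicts = [("alice", "bob"), ("bob", "carol"), ("carol", "dave")]
guests = sorted({g for pair in conflicts for g in pair})

def valid(assignment):
    """True if no conflicting pair of guests shares a table."""
    return all(assignment[a] != assignment[b] for a, b in conflicts)

k = 2  # number of tables
# Brute force tries all k**n assignments -- exponential in the guest
# count, which is exactly the scaling a quantum system aims to tame.
for tables in product(range(k), repeat=len(guests)):
    assignment = dict(zip(guests, tables))
    if valid(assignment):
        print(assignment)
        break
```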

If this sounds like a lot of trouble for searching a database or assigning some seats, the real payoff comes when the system is scaled up to thousands of qubits. Quantum computers of this size should be able to solve problems that cannot be solved by any conventional computer, no matter how large or powerful.

“There are problems out there that just don't scale polynomially, they scale exponentially,” says D-Wave CEO Herb Martin.

He is referring to NP-complete problems, which require examining a very large number of possibilities. For these types of problems, computation time on a conventional digital computer grows exponentially with the size of the problem. An example is the subset sum problem, which is important in cryptography. The problem may be stated as follows: for a given set of integers, does a subset of the numbers exist which, when added together, equals zero? For example, in the set {-7, -3, -2, 5, 8}, the subset {-3, -2, 5} is the solution. A digital computer can determine this in a fraction of a second. But if the set grew to a couple of hundred elements, the same computer would need billions of years, since the number of subsets to examine doubles with every element added. A quantum computer of reasonable size could solve it almost instantly.
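A brute-force solver makes the exponential scaling explicit. This short sketch enumerates all 2^n - 1 nonempty subsets, which is trivial for five numbers and hopeless for a couple of hundred:

```python
from itertools import combinations

def subset_sum_zero(numbers):
    """Brute-force search for a nonempty subset summing to zero.

    Checks all 2**n - 1 nonempty subsets, so the running time grows
    exponentially with n -- the scaling described above.
    """
    for r in range(1, len(numbers) + 1):
        for subset in combinations(numbers, r):
            if sum(subset) == 0:
                return subset
    return None

print(subset_sum_zero([-7, -3, -2, 5, 8]))  # (-3, -2, 5)
```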

Or could it? D-Wave's Geordie Rose admits that using quantum computers to achieve exact solutions to NP-complete problems is unproven. D-Wave's specific claim is that its systems will be able to derive very useful “approximate solutions” for applications where an exact solution is not required.
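The closest classical analogue of that approach is simulated annealing, which trades exactness for useful approximate answers. The sketch below (a standard classical heuristic, not D-Wave's algorithm) applies it to the subset-sum instance above, treating the absolute value of the subset's sum as the energy to be minimized:

```python
import math
import random

def subset_cost(numbers, mask):
    """Energy of a candidate subset: |sum|, with the empty subset disallowed."""
    if not any(mask):
        return float("inf")
    return abs(sum(x for x, keep in zip(numbers, mask) if keep))

def anneal(numbers, steps=20000, temp=5.0, cooling=0.9995):
    """Simulated annealing over subset membership bits."""
    mask = [random.random() < 0.5 for _ in numbers]
    best = cur = subset_cost(numbers, mask)
    best_mask = list(mask)
    for _ in range(steps):
        i = random.randrange(len(numbers))
        mask[i] = not mask[i]                      # propose flipping one bit
        new = subset_cost(numbers, mask)
        if new <= cur or random.random() < math.exp((cur - new) / temp):
            cur = new                              # accept the move
            if cur < best:
                best, best_mask = cur, list(mask)
        else:
            mask[i] = not mask[i]                  # reject: undo the flip
        temp *= cooling                            # gradually cool the system
    return [x for x, keep in zip(numbers, best_mask) if keep], best

print(anneal([-7, -3, -2, 5, 8]))  # typically ([-3, -2, 5], 0)
```

Adiabatic quantum computing follows the same "slowly settle into a low-energy state" intuition, but does so with physical quantum evolution rather than random bit flips.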
 
Virtually any industry has applications that could make use of this capability. It applies to most real-world problems where a combinatorial explosion limits how fast a conventional computer can generate a useful solution. Applications like protein folding, drug discovery, genomics, machine vision, security biometrics, quantitative finance, data mining, VLSI layout, nanoscale simulation, supply chain management, and many others can be recast as QC-native algorithms. All of these problems are currently being addressed with conventional computers, but the problem sizes they can handle will always be limited by the digital nature of the computation.

This is not to suggest that conventional computers are doomed to extinction. The folks at D-Wave believe that quantum devices will augment digital computers, much as a hardware accelerator is used today. This seems to be a widely held view in the computing community.

“From a business perspective, I think that quantum computers are never going to completely displace classical supercomputers,” said Colin Williams, a senior QC researcher at JPL. “What I foresee is a sort of symbiotic relationship, where you have something akin to a quantum co-processor and the classical supercomputer would farm out specific questions for the quantum co-processor to answer; and then it would get the answer and incorporate that into its own ongoing computation.”
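Williams' model is straightforward to picture in code. In this hypothetical sketch, `quantum_solve` stands in for whatever interface a quantum co-processor would expose; the classical driver keeps easy work local and farms out the combinatorial cores:

```python
def hybrid_search(instances, quantum_solve, classical_solve, cutoff=20):
    """Classical driver in the co-processor model: small instances are
    solved locally, large combinatorial cores are farmed out."""
    results = []
    for numbers in instances:
        if len(numbers) > cutoff:
            answer = quantum_solve(numbers)     # offload to the co-processor
        else:
            answer = classical_solve(numbers)   # cheap enough to do locally
        results.append(answer)                  # fold back into the main run
    return results

# Usage with the sketches above, the annealer standing in for the device:
# hybrid_search(batches, quantum_solve=anneal, classical_solve=subset_sum_zero)
```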

But despite this week's demonstration, the question of quantum computing's viability remains, and there is no shortage of D-Wave skeptics. QC researchers note that the company has not published its work in peer-reviewed journals, and many doubt that the company's offering represents true quantum computing. At the center of the controversy is whether adiabatic quantum computation is all it's cracked up to be. For the adiabatic model to work, the computation must be driven fast enough to deliver an answer in a useful timeframe, but slowly enough to maintain the adiabatic condition. Many believe that balance may not be achievable. The real proof point will come when a machine with more qubits solves an NP-complete problem of sufficient size to demonstrate the expected quantum computing acceleration.

While the prototype demonstrated this week is not ready to do this, D-Wave has used this opportunity to get the word out that QC is not just something relegated to the research labs. According to CEO Herb Martin, the company is planning to release an online system in Q4 of 2007. This 32-qubit machine will be made available to the open source community to encourage users to port their applications to the company's platform. Beyond that, D-Wave intends to deliver a commercial 512-qubit machine in mid-2008 and a 1,024-qubit system by the end of that year. Stay tuned.
