Computing with light?

By Anne Stuart

March 10, 2006

Light is the solution. It's also the problem.

That's the paradox HP Labs' Quantum Information Processing Group is beginning to unravel with its research into optical quantum computing.

The group has been investigating ways to use photons, or light particles, for information processing, rather than the electrons used in digital electronic computers today. Their work holds promise for someday developing faster, more powerful and more secure computer networks.

“Quantum processing can attack problems we can't attack with conventional computers,” says Tim Spiller, the HP Distinguished Scientist who is leading the research. “Even a small quantum computer has the potential to enhance communications and information processing.”

Previous designs impractical

Today's computers work by manipulating bits that exist as either 0s or 1s. What makes quantum computing so powerful is that quantum bits (or qubits) can exist in superpositions of 0 and 1, meaning a quantum computer can, in effect, work with many possible values at once. For example, a quantum computer could efficiently factor large numbers that today's, or even tomorrow's, conventional machines might never be able to crack.
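The distinction between a classical bit and a qubit can be illustrated with a tiny state-vector sketch. This is a generic textbook illustration, not HP's technology; the function names are our own:

```python
import math

# A single qubit is a pair of complex amplitudes (a, b) with |a|^2 + |b|^2 = 1.
# A classical bit is the special case (1, 0) -> "0" or (0, 1) -> "1"; a qubit
# can be any normalized mixture of the two.

def hadamard(state):
    """Put a qubit into an equal superposition of 0 and 1."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    """Measurement probabilities for outcomes 0 and 1 (Born rule)."""
    a, b = state
    return (abs(a) ** 2, abs(b) ** 2)

zero = (1.0, 0.0)                  # the classical value 0
superposed = hadamard(zero)        # equal parts 0 and 1
p0, p1 = probabilities(superposed)
print(round(p0, 3), round(p1, 3))  # 0.5 0.5 -- either outcome equally likely
```

Measuring the superposed qubit still yields a single 0 or 1; the power comes from manipulating the amplitudes of many qubits before measuring.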

Scientists have long believed that light is the best candidate for moving quantum information from place to place. However, previous designs for quantum computing with light have been extremely inefficient, and therefore completely impractical as actual technology.

Challenges in working with light

“Light is very good for communication because the bits of light don't talk to each other,” Spiller explains. “You can send light over long distances — for instance, with optical fibers — and it preserves its state pretty well. It doesn't communicate with other bits of light — or with much else, either. That's why you can have many different conversations going on at the same time in the same telephone cable and they don't interfere with each other.”

Therein lies the problem. “To do any kind of data processing, the bits of data need to be able to interact,” as they do in today's computer systems, Spiller says. “So on the face of it, light isn't good for information processing because the bits of light don't talk to each other. We need a process to get pieces of light at the quantum level to talk to each other.”

That's exactly what his team — including Principal Research Scientist Bill Munro, other HP Labs researchers and Kae Nemoto, an associate professor of quantum information sciences at the Tokyo-based National Institute of Informatics — has spent two years trying to develop.
 
Photon detection method a breakthrough

Their starting point was solving yet another puzzle: how to detect photons, or individual chunks of light, without absorbing or damaging them. Typically, detecting a single photon requires letting it “smack into” something, such as a piece of semiconductor material, Spiller says. “That creates a lot of electrons and holes, and the piece of light is lost. So you can detect it, but in the process of doing so, it's destroyed.”

Now the team has developed a method that both detects photons without destroying them and lets bits of light communicate with each other: the photons 'talk' to one another via a probe light signal. A photon leaves an imprint of itself on the probe light signal without being damaged in the process, allowing the researchers to detect it nondestructively.

“You can measure the probe light signal and look for the photon's imprint,” Spiller explains. “If you see it, it's there; if you don't, it isn't.”
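The imprint idea can be caricatured numerically. In this toy model (our own illustrative assumption, not the team's actual physics), the bright probe beam is summarized by a single complex amplitude, and a photon that is present rotates that amplitude by a small phase while the photon itself is untouched:

```python
import cmath

# Hypothetical toy model: a photon, if present, imprints a small phase
# rotation THETA on the probe's complex amplitude. The value of THETA is
# purely illustrative, not a measured interaction strength.
THETA = 0.1  # radians

def probe_after_interaction(alpha, photon_present):
    """Return the probe amplitude after (possibly) meeting a photon."""
    return alpha * cmath.exp(1j * THETA) if photon_present else alpha

def detect_photon(alpha_in, alpha_out):
    """Infer the photon's presence from the probe's phase imprint alone."""
    phase_shift = cmath.phase(alpha_out / alpha_in)
    return abs(phase_shift) > THETA / 2   # threshold halfway to the imprint

alpha = 4.0 + 0j                          # bright probe, phase 0
out = probe_after_interaction(alpha, photon_present=True)
print(detect_photon(alpha, out))          # True -- photon seen, never absorbed
```

The key point the sketch captures is that the measurement is made on the probe, not on the photon, so the photon survives detection.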

Better communication, more security

Equally important: the probe light signal lets photons interact, albeit indirectly. “If you do a certain type of measurement on the probe after it's talked to two photons, you'll find that although they don't talk directly to each other, the photons have interacted because they both talk to the probe,” Spiller says.

Theoretically, the researchers say, there's no limit to the number of photons that can interact this way. For that reason, their work represents a quantum step toward creating a scalable method for optical quantum computing.

“The nice thing about this is it's moving toward having the best of both worlds,” Spiller says. “The best communication is done with light. If you can also compute with light, you can do everything with light.” As a result, there is no need to convert quantum information from some type of electronic format to the new optical one and back.

Work to be done

The HP Quantum Information Research effort circles the globe, with researchers in Bristol, Palo Alto and, through Nemoto, Tokyo. Their efforts, which could fundamentally change the way computers and people work, have caught the attention of scientists worldwide. Many call the team's early findings important and promising, but warn that researchers are years away from being able to build a real optical quantum computing system.

Spiller readily concurs. “Our vision is long-term, and we're starting small,” he says. “We hope to have some experimental results in the next couple of years. Once we've got that started, once we know for sure how one photon can talk to one beam, we can go forward to build quantum processors.”

Potential applications

Quantum computing has the potential to revolutionize information technology. Small quantum processors containing just a few qubits could be used to stretch out the distances over which secure quantum communications work, analogous to the way that conventional optical repeaters are used to amplify ordinary optical communications. Such small processors may also enable new sensing and measurement technology.

Comparably small (in terms of qubit number), but distributed, quantum processors could enable new protocols such as secure quantum auctions between separated parties, or quantum voting.

Mid-sized quantum processors (50-100 qubits) could be used as research tools, allowing simulation of quantum systems that currently cannot be performed on even the most powerful supercomputers. Large quantum computers (with tens of thousands of qubits) will likely be able to search more quickly than conventional technology and factor very large numbers efficiently, leading to quantum code-breaking.
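The scale of the search speedup is easy to put in numbers: unstructured search over N items needs roughly N/2 classical probes on average, while Grover's quantum algorithm needs on the order of the square root of N iterations. A back-of-the-envelope comparison:

```python
import math

# Average-case exhaustive search vs. the standard Grover iteration count
# ((pi/4) * sqrt(N)) for unstructured search over N items.

def classical_probes(n):
    return n / 2

def grover_steps(n):
    return (math.pi / 4) * math.sqrt(n)

n = 1_000_000
print(int(classical_probes(n)), int(grover_steps(n)))  # 500000 785
```

For a million items, the quantum approach needs fewer than a thousand steps where a classical scan needs half a million, and the gap widens as N grows.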

New paradigm for IT

Other potential applications are likely to evolve over time. Munro points out that transistors were initially used in hearing aids.

“At that time, you only had big valves made of glass so you couldn't have a small integrated circuit. Then somebody put a few components on a small device used for hearing aids,” he says. “The people who did that didn't foresee that down the road you'd have millions of transistors, or integrated circuits, on pieces of silicon, running devices of all types and sizes.”

Spiller says he expects that quantum computing will enhance, rather than replace, the current standard.

“We believe quantum computing will grow alongside conventional computing. You might have a quantum processor sitting next to your conventional machine,” he says. “It's not that quantum technology will sweep everything else away. Instead, it will enable new things.”


This article is reprinted courtesy of Hewlett-Packard Company.
