Engineers at IBM have developed a fully integrated wavelength multiplexed silicon photonics chip, which the company says will soon enable manufacturing of 100 Gb/s optical transceivers. The advance promises to offer a more economical way to move the huge amounts of data required for cloud computing and big data applications.
It’s a significant milestone for silicon photonics technology, says IBM, referring to the method of using pulses of light instead of electrical signals over copper wires to transfer large volumes of data at very high speed between computer chips.
The IBM design allows optical components to be integrated alongside traditional electrical circuits on a single silicon chip using sub-100 nm semiconductor technology. The company estimates that the new transceiver can transfer 63 million tweets or six million images in just one second, and enable an entire high-definition digital movie to be downloaded in just two seconds.
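A quick back-of-envelope check, assuming the full 100 Gb/s link rate is usable for payload, shows what average per-item sizes IBM's figures imply (the tweet and image sizes below are derived from the claims, not stated by IBM):

```python
# Sanity-check the throughput claims at an assumed usable rate of 100 Gb/s.
link_rate_bits = 100e9                    # 100 gigabits per second
bytes_per_second = link_rate_bits / 8     # 12.5 GB/s

# Implied average payload sizes from the quoted per-second counts
tweets_per_second = 63e6
images_per_second = 6e6
print(bytes_per_second / tweets_per_second)   # ~198 bytes per tweet
print(bytes_per_second / images_per_second)   # ~2083 bytes per image

# "HD movie in two seconds" implies a file of roughly 25 GB
movie_bytes = bytes_per_second * 2
print(movie_bytes / 1e9)                      # 25.0 GB
```

The implied ~200 bytes per tweet is consistent with a short text message; the movie figure matches a Blu-ray-scale file.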
Although the release focused more on the enterprise and consumer space, integrated silicon photonics addresses heat and power issues that face HPC as it heads to exascale and beyond.
The IBM tech is apparently fab-friendly too:
“IBM’s new CMOS Integrated Nano-Photonics Technology…makes use of standard fabrication processes at a silicon chip foundry, making this technology ready for commercialization.”
Towards a Practical Quantum Computer
The photonics advance comes on the heels of another circuit breakthrough from Big Blue, which moves the needle towards the holy grail that is quantum computing. For the first time, IBM researchers have demonstrated the ability to simultaneously detect and measure bit-flip and phase-flip quantum errors, a feature they say is critical for “any real quantum computer.” They also revealed a new, square quantum bit circuit design that the researchers believe is “the only physical architecture that could successfully scale to larger dimensions.”
The square lattice structure, which consists of four superconducting qubits on a one-quarter-inch square chip, is the innovation that enables researchers to detect both kinds of quantum errors simultaneously. When the qubits are arranged in a linear array, this capability is lost.
“Up until now, researchers have been able to detect bit-flip or phase-flip quantum errors, but never the two together. Previous work in this area, using linear arrangements, only looked at bit-flip errors offering incomplete information on the quantum state of a system and making them inadequate for a quantum computer,” said Jay Gambetta, a manager in the IBM Quantum Computing Group. “Our four qubit results take us past this hurdle by detecting both types of quantum errors and can be scalable to larger systems, as the qubits are arranged in a square lattice as opposed to a linear array.”
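The principle behind detecting both error types can be illustrated with a toy two-qubit sketch (not IBM's four-qubit circuit): a Z-type parity check catches bit-flips, while an X-type parity check catches phase-flips, so both check types are needed to see both errors.

```python
import numpy as np

I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)   # bit-flip error
Z = np.array([[1, 0], [0, -1]], dtype=complex)  # phase-flip error

def kron(*ops):
    """Tensor product of a list of single-qubit operators."""
    out = np.array([[1.0 + 0j]])
    for op in ops:
        out = np.kron(out, op)
    return out

ZZ = kron(Z, Z)   # Z-type parity check
XX = kron(X, X)   # X-type parity check

# Bell state (|00> + |11>)/sqrt(2): +1 eigenstate of both ZZ and XX
psi = np.zeros(4, dtype=complex)
psi[0] = psi[3] = 1 / np.sqrt(2)

def syndromes(state):
    """Expectation values of the two parity checks: +1 = no error seen."""
    return (round(np.real(state.conj() @ ZZ @ state), 6),
            round(np.real(state.conj() @ XX @ state), 6))

print(syndromes(psi))               # (1.0, 1.0)  -> no error detected
print(syndromes(kron(X, I) @ psi))  # (-1.0, 1.0) -> bit-flip trips the ZZ check
print(syndromes(kron(Z, I) @ psi))  # (1.0, -1.0) -> phase-flip trips the XX check
```

A linear-array code that measures only Z-type checks would see the first column and miss phase-flips entirely, which is the limitation the square lattice removes.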
IBM expressed confidence that this design can be scaled by adding more qubits to achieve a working quantum system.
“Quantum computers promise to open up new capabilities in the fields of optimization and simulation simply not possible using today’s computers,” remarked a company press release. “If a quantum computer could be built with just 50 quantum bits (qubits), no combination of today’s TOP500 supercomputers could successfully outperform it.”
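One hedged way to see why 50 qubits strain classical machines: a brute-force state-vector simulation must track 2^50 complex amplitudes, and at 16 bytes per double-precision complex number that alone is 16 pebibytes of memory (an illustrative lower bound for this naive approach, not the basis of IBM's claim):

```python
# Memory for a naive full state-vector simulation of n qubits
n_qubits = 50
amplitudes = 2 ** n_qubits          # one complex amplitude per basis state
bytes_per_amplitude = 16            # complex128: two 8-byte floats
total_bytes = amplitudes * bytes_per_amplitude
print(total_bytes / 2 ** 50)        # 16.0 -> 16 pebibytes (PiB)
```

Cleverer simulation methods can trade memory for time, but the exponential growth with qubit count remains.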
The research, which was partly funded by IARPA (Intelligence Advanced Research Projects Activity), is described in the April 29 issue of the journal Nature Communications.