CQuIC Names New Director

December 6, 2018

The University of New Mexico’s Center for Quantum Information and Control (CQuIC) recently named Professor Ivan Deutsch as the Center’s new director. Deutsch was named to the position following the retirement of the Center’s former director, Professor Carlton Caves.

CQuIC is a college-wide center in the College of Arts & Sciences that is also supported by a center-wide grant from the National Science Foundation (NSF). CQuIC was established in 2009 as a follow-up to the Center for Advanced Studies (CAS), which pioneered quantum optics research in the 1980s.

CQuIC’s focus is on the control of complex quantum systems, with an overarching goal of making quantum systems march to the researchers’ orders instead of doing what comes naturally. Such control is an essential part of the “Second Quantum Revolution,” which promises a radically new paradigm for information processing tasks such as communications, computing, and sensing. Quantum Information Science (QIS) also provides a new framework for understanding the fundamental workings of the universe such as the mysteries of black holes.

Deutsch, a Regents’ Professor in the Department of Physics and Astronomy, came to UNM in 1995 and joined Caves to establish a group in what was then the brand-new field of QIS.

“Carl called us the Information Physics Group, at the intersection between information theory and physics,” said Deutsch. “QIS didn’t even have a name at that time. The quantum part of it had just kind of burst onto the scene in 1994 with a big discovery of a quantum algorithm that could break all the secret codes that encode everybody’s credit card numbers and every secret on the Internet.”
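The 1994 discovery Deutsch refers to is Shor’s factoring algorithm, which threatens public-key schemes such as RSA because factoring the modulus reveals the private key. A minimal sketch of the idea, in Python, with the period-finding step (the part a quantum computer would accelerate) brute-forced for tiny numbers and the helper names purely illustrative:

```python
# Illustrative sketch of why Shor's 1994 algorithm threatens RSA-style encryption:
# factoring N reduces to finding the period r of a^x mod N. A quantum computer
# finds r efficiently; here it is brute-forced, which only works for tiny N.
from math import gcd

def find_period(a, N):
    """Order of a modulo N -- the step a quantum computer would speed up."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical_part(N, a=2):
    """Recover a nontrivial factor of N from the period, when the period is usable."""
    if gcd(a, N) != 1:
        return gcd(a, N)                     # lucky guess: a already shares a factor
    r = find_period(a, N)
    if r % 2 == 1 or pow(a, r // 2, N) == N - 1:
        return None                          # unusable period; retry with another a
    return gcd(pow(a, r // 2) - 1, N)

print(shor_classical_part(15))               # prints 3, since 15 = 3 * 5
```

On a real quantum computer the period would come from a quantum Fourier transform rather than the loop above, which is the source of the exponential speedup over classical factoring.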

Since then, UNM and the national laboratories in New Mexico have been pioneers in the field of QIS. The standard textbook in the field was written here, and New Mexico has played a central role in the development of QIS. Currently, CQuIC works closely with two local partners, Sandia National Laboratories and Los Alamos National Laboratory, as well as the University of Arizona and other international partners.

CQuIC’s research is organized into four main research areas: quantum information and computation, quantum control and measurement, quantum metrology, and quantum optics and communication. CQuIC has extensive theoretical and experimental research programs in all of these areas. In 2016, CQuIC competed for and won renewal of its center-wide NSF grant, and it now leads one of the largest QIS research efforts in the country.

“As part of the renewal, CQuIC was made into an NSF Focused Research Hub in Theoretical Physics (FRHTP), one of only two in the country,” Deutsch proudly points out. “We are the FRHTP for QIS. The Hub is designed for the professional development of postdoctoral researchers. This turbocharges our Center, which impacts student researchers as well.”

CQuIC has received new awards, including a share of the $15 million NSF initiative dubbed the Software-Tailored Architecture for Quantum co-design (STAQ) project, an effort that seeks to demonstrate a quantum advantage over traditional computers within five years using ion trap technology. The program is the NSF’s largest quantum computing effort to date. Also announced recently, CQuIC received two of only eight prestigious Google Focus Award gifts worldwide to help develop Google’s new quantum computer.

CQuIC finds itself at a unique point in its history. The field of QIS has grown by leaps and bounds around the world, and there is widespread interest and excitement in the Second Quantum Revolution. The rise of a nascent quantum industry is one of the biggest stories.

“IBM, Microsoft, Google, Intel, and other big industrial giants are now making big investments in QIS,” said Deutsch. “This has become a bit of an arms race and there’s a lot of international competition, in particular with China who is investing heavily in this area, on the order of $10 billion.

“This international competition together with the rise of a quantum industry has gotten the attention of Washington D.C., and so there’s currently bipartisan legislation in Congress, called the National Quantum Initiative (NQI) Act.”

Deutsch said the NQI Act has passed the House and is currently pending in the Senate. The bill calls for increased investment in quantum information science, with an immediate injection of $1.3 billion over the next five years. The NSF and the Department of Energy laboratories, including Sandia and Los Alamos, stand to be a big part of the increased spending if the legislation passes. Deutsch says that one of the key components of the NQI is workforce development and interdisciplinary training in QIS, and UNM is poised to play a major role.

Deutsch is focusing his efforts so that UNM will be able to participate in and capitalize on the NQI, given CQuIC’s early lead in QIS. That means growing QIS in departments beyond physics, including math, chemistry, electrical and computer engineering, and computer science, and taking advantage of exciting new capabilities at the Center for High Technology Materials (CHTM). CHTM has a separate, related research program in Quantum Materials, which will strengthen UNM’s overall position in quantum technologies. Collaboration between the College of Arts and Sciences and the School of Engineering will be critical for the overall program in QIS.

“UNM is really at the forefront in QIS, building on CQuIC’s pioneering efforts,” said Deutsch. “My goal is to build a strong interdisciplinary program in quantum information science and technology across departments and colleges and to keep that momentum going.”


Source: UNM
