For much of the history of aviation, designers and engineers ran their calculations and experiments without the benefit of computers. But as the price of computation dropped, the competitive advantage conferred by digital engineering became clear. The aerospace industry has now reached a turning point: the complexity of its simulation work is driving increased adoption of high-performance computing.
To wit, an article in Aviation Week points out that aerospace does not make a strong showing in the TOP500 list of the world’s fastest supercomputers. “NASA’s Pleiades at Ames Research Center is ranked 21st, well behind the fastest machine, China’s Tianhe-2,” the author writes. “The Air Force Research Laboratory’s Spirit is 24th and the highest-ranked supercomputer owned by a manufacturer is Airbus’s HPC4, at 72 on the list.” (Another Airbus system earned a 196th ranking with 243.9 teraflops.)
The compute power of these machines ranges from 517 teraflops (HPC4) to 1.42 petaflops (Spirit) to 1.54 petaflops (Pleiades). By contrast, the world’s fastest system, Tianhe-2, has a benchmarked performance of nearly 33,900 teraflops (33.9 petaflops), and the second-fastest system, the US Oak Ridge National Laboratory’s Titan, achieved 17.6 petaflops on the LINPACK benchmark.
The article cites a March report on the future of computational fluid dynamics put out by the U.S. National Research Council that says new design codes and access to more powerful machines are needed to “tackle the challenges of fully simulating turbulent, separated flow over aircraft or off-design operation of engines.”
In order to continue supporting advanced science and industry workloads, vendors and their academic partners are striving to develop next-generation supercomputers roughly 100 times faster than today’s leading systems. These are expected to be massively parallel machines with an exaflop or more of computing power.
While current techniques and architectures can likely be exploited for one more supercomputer generation, it is widely accepted that beyond-exascale computing will require the advent of new technologies. Potential post-silicon candidates include quantum, superconducting, molecular and neuromorphic computing.
Lockheed Martin is one of the aerospace manufacturers considering some of these alternatives. In 2010, Lockheed Martin purchased the “first commercially available quantum computer” from Canadian startup D-Wave. The 512-qubit D-Wave 2 is installed at the University of Southern California (USC). A second D-Wave system was sold in 2013 to partners Google, NASA and the Universities Space Research Association (USRA), who jointly founded the Quantum Artificial Intelligence Lab to explore applications for the machine. This week, Google announced its intention to develop superconducting-based quantum processors via a partnership with the University of California, Santa Barbara (UCSB).
The D-Wave machine has been criticized for not being a fully functioning “universal” quantum computer. In fact, the system is an “adiabatic” computer that uses quantum annealing to solve a specific class of problems known as optimization problems. Lockheed is using D-Wave for verification and validation of software, which grows more time-consuming and costly as systems become more complex. Testing adaptive, non-deterministic software is another potential use case.
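To get a feel for the optimization problems an annealer targets, consider QUBO (quadratic unconstrained binary optimization), the form in which problems are posed to D-Wave hardware. The sketch below is a purely classical simulated-annealing analogue on a made-up toy QUBO instance, not a model of the quantum hardware or of Lockheed’s workload; the matrix `Q` and all parameters are illustrative assumptions.

```python
import math
import random

# Toy QUBO instance: minimize x^T Q x over binary vectors x.
# QUBO is the problem class quantum annealers accept; this Q is
# a made-up 3-variable example for illustration only.
Q = [[-1,  2,  0],
     [ 0, -1,  0],
     [ 0,  0, -1]]

def energy(x):
    """Evaluate the QUBO objective x^T Q x for a binary vector x."""
    n = len(x)
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

def anneal(steps=5000, t_start=2.0, t_end=0.01, seed=1):
    """Classical simulated annealing with single-bit-flip moves."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(len(Q))]
    e = energy(x)
    best, best_e = x[:], e
    for k in range(steps):
        # Geometric cooling from t_start down to t_end.
        t = t_start * (t_end / t_start) ** (k / steps)
        i = rng.randrange(len(x))
        x[i] ^= 1                      # propose flipping one bit
        e_new = energy(x)
        # Metropolis rule: always accept downhill moves,
        # accept uphill moves with probability exp(-delta/t).
        if e_new <= e or rng.random() < math.exp((e - e_new) / t):
            e = e_new
            if e < best_e:
                best, best_e = x[:], e
        else:
            x[i] ^= 1                  # reject: revert the flip
    return best, best_e

print(anneal())  # optima of this toy instance have energy -2
```

A quantum annealer pursues the same goal, the lowest-energy binary configuration, but by evolving a physical system of qubits rather than by stochastic search, which is why D-Wave’s machine applies only to problems that can be cast in this form.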
Lockheed is also working with University of Maryland researchers on an integrated quantum computing platform. The parties signed a memorandum of understanding establishing the Quantum Engineering Center at the University of Maryland, College Park in March. The goal of the collaboration is to develop a reliable quantum platform that is as easy to operate as conventional computers are today.
“In the case of quantum components, it’s like we’re back in 1947 working with the first semiconductor transistors,” said University of Maryland physics professor Dr. Chris Monroe in the official announcement. “We are talking about unusual systems — specially tuned laser and microwave fields trained with exquisite precision onto individual atoms suspended with electrical fields and immersed in a vacuum chamber a million times less dense than outer space. Each aspect is challenging in its own way, but we understand exactly how every piece works. Our focus now is integrating these systems to consistently and reliably work in harmony, much like engineering a complex aircraft, so that the device is more than just a sum of its parts.”