THE DANCE OF TWO BLACK HOLES

October 27, 2000

by Michael Schneider, Pittsburgh Supercomputing Center

Pittsburgh, PA. — Once upon a time, on a small planet in a galaxy called the Milky Way, black holes were considered fascinating theoretical artifacts of the mathematics of general relativity – an interesting concept, great stuff for science fiction. We’ve come a long way since 1915, when Einstein laid out the theory that rocked our world.

In 1967, American physicist John Wheeler coined the phrase “black hole,” giving resonance to the concept of points in space-time where matter is so condensed, gravity so fiercely omnivorous, that they swallow everything, including light, that gets too close. Only 12 years ago, with observational evidence beginning to trickle in – swirling gas and densely crowded stars at the centers of galaxies – Stephen Hawking wrote, prophetically, in A Brief History of Time: “The number of black holes may well be greater even than the number of visible stars.”

Since 1994, the Hubble Space Telescope and, more recently, NASA’s Chandra X-ray Observatory have convincingly lifted black holes from theory into reality. With data from these eyes in space, scientists have identified over 30 likely black holes and counting. They come in a range of sizes, from supermassive (like the monster with the mass of 30 million suns at the center of the Andromeda galaxy) to many that are small (a few solar masses), and most recently a middleweight (about 500 solar masses) in galaxy M82.

Still, even with Hubble and Chandra, the evidence is circumstantial. Fundamentally, a black hole is invisible. Looking for one, as Hawking said, is like trying to find a black cat in a coal cellar. The observations offer reasoned surmises about an undetectable agent lurking in the interior of detectable phenomena. As Penn State astrophysicist Pablo Laguna and post-doctoral fellow Deirdre Shoemaker like to point out, the way to establish indisputably that black holes exist and that Einstein’s equations are right is to detect gravity waves from two black holes.

Detecting gravity waves is the job, a big one, cut out for LIGO, Virgo and GEO600. LIGO (Laser Interferometer Gravitational-Wave Observatory) comprises two NSF-funded gravity-wave detectors – in Livingston, Louisiana, and Hanford, Washington – now undergoing testing. Virgo and GEO600 are under construction in Europe (near Pisa, Italy, and Hannover, Germany). Together these projects represent a pioneering effort that scientists hope will lead the way to an invaluable new set of eyes – gravity eyes – for seeing the universe. But it won’t be easy, especially since no one has ever detected a gravity wave.

Along with anticipating black holes, Einstein’s theory predicts that the accelerated movement of massive objects in space, as in supernova explosions and black-hole collisions, will produce ripples traveling at light-speed through space-time. As with black holes, there’s indirect evidence he was right, but compared to other wave phenomena, like electromagnetism, which brings us radio and TV, gravity waves are very weak. Einstein speculated they might never be detected. If you think of LIGO as the gigantic antenna for a radio receiver, the strongest possible signal might be a faint crackle as you turn the dial. To improve the chances of hearing the first crackle of gravity from the cosmos, LIGO needs to know where to set the dial to tune in two black holes colliding with each other.
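How weak is weak? A standard back-of-the-envelope figure (ours, not from the article) comes from Einstein’s quadrupole formula, which gives the strain h – the fractional stretching of space – at distance r from a source whose mass distribution has a changing quadrupole moment Q:

```latex
% Quadrupole formula (leading order): strain at distance r from a
% source whose mass-quadrupole moment Q has second time derivative
% \ddot{Q}. The prefactor G/c^4 is what makes gravity waves so faint.
h \sim \frac{2G}{c^{4}} \, \frac{\ddot{Q}}{r},
\qquad
\frac{G}{c^{4}} \approx 8 \times 10^{-45}\ \mathrm{s^{2}\,kg^{-1}\,m^{-1}}
```

That minuscule prefactor is why even two merging black holes are expected to stretch space at Earth by only about one part in 10^21 – a few thousandths of a proton’s diameter over LIGO’s four-kilometer arms – and why the detectors need precise predictions of what to listen for.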

To do this, researchers like Laguna and Shoemaker are using supercomputers, the most powerful they can find, to numerically solve Einstein’s equations. Their field is called numerical relativity, and with collaborators at the University of Texas and the University of Pittsburgh, Penn State has assembled one of the leading groups in the world. In recent work, relying on systems at PSC, at NCSA in Illinois and elsewhere, this multi-university team successfully simulated two black holes merging in what’s called a grazing collision – only the second time this has been accomplished. Their numerical approach, called black-hole excision, makes a notable dent in the two-black-hole problem, the central challenge of this demanding field.

“Einstein’s equations describe gravity via an elegant but complicated set of non-linear partial differential equations,” says Laguna. “Their complexity requires the most powerful supercomputers available. Accurately solving the two-black-hole problem, formulated conceptually by Einstein 80 years ago, will represent an historic moment in the development of general relativity theory, with extremely important implications for astrophysics and cosmology.”
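For reference, the equations Laguna describes can be written deceptively compactly (a standard textbook form, not reproduced from the article):

```latex
% Einstein's field equations: R_{\mu\nu} is the Ricci curvature,
% R its trace, g_{\mu\nu} the space-time metric, and T_{\mu\nu} the
% stress-energy of matter -- ten coupled, non-linear PDEs in disguise.
R_{\mu\nu} - \tfrac{1}{2} R \, g_{\mu\nu} = \frac{8 \pi G}{c^{4}} \, T_{\mu\nu}
```

For black holes in vacuum, the right-hand side is zero; all the difficulty, and all the physics of a merger, lives in the non-linearity of the left.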

The mathematics of a single, non-rotating, spherical black hole sitting in space was worked out long ago by German astronomer Karl Schwarzschild, who in 1916, months before his death, in effect discovered the black hole, without naming it, as one of the implications of Einstein’s theory. A single black hole by itself, however, doesn’t make gravity waves. Add another black hole – the interesting and, many believe, very relevant situation of two black holes merging with each other, often called a binary black hole – and you fiendishly complicate the mathematics, to the point where the only hope is supercomputers.

“As in most physical studies,” says Shoemaker, “you want to look at the complicated and more realistic situations to test what you know. With general relativity, you can’t put two of these compact objects together and get a solution without advanced computational techniques. Two black holes takes the theory into a dynamical regime, where you can make predictions and then, if experiments verify the predictions, you know how far the theory is correct.”

It’s a mutually beneficial relationship. To verify the predictions, you need detectors. LIGO, Virgo and GEO600, likewise, need predictions. Many believe colliding black holes offer the best shot at detecting gravity waves. Theory says they produce one of the strongest signals on the gravity-wave dial. To know if a crackle of static is the dance of two black holes or cosmic noise, the detectors need the answers numerical relativists are working to provide.

“Abandon hope, all ye who enter here,” said Dante of the entrance to Hell. He might have said the same about the event horizon of a black hole. In solving Einstein’s equations, Schwarzschild started with the idea of an infinitely condensed mass and showed that space-time curves around it and closes on itself. Once matter or light enters space-time within a certain radius from that point – initially called the Schwarzschild radius, now the event horizon – there’s no escape. The region inside the horizon is cut off from events outside. This principle, called cosmic censorship, underlies black-hole excision.
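That radius has a simple closed form (a standard result, added here for scale):

```latex
% Schwarzschild radius: the event-horizon radius of a non-rotating
% black hole of mass M, where M_\odot is the mass of the Sun.
r_{s} = \frac{2 G M}{c^{2}} \approx 3\ \mathrm{km} \times \frac{M}{M_{\odot}}
```

A stellar-mass black hole is thus only kilometers across, while the horizon of Andromeda’s 30-million-solar-mass monster would span a region comparable to the orbit of Mercury.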

The killer for simulating black holes is the singularity, the point of infinite density and space-time curvature that, mathematically speaking, makes a black hole a black hole. “The most crucial aspect of numerically evolving spacetimes containing black holes,” says Laguna, “is without doubt the accurate and long-term handling of the singularities these objects represent.”

Simply put, the numbers get too big too fast, and the computation crashes. “If you get too far inside the black hole,” says Shoemaker, “you run into huge gradients that kill your calculations. There are basically two alternatives. In one of them you exploit the relativity of time; in effect you slow down how fast clocks tick near the black hole to avoid approaching that area. The other way is to remove the dangerous area. We did the latter.”
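Shoemaker’s first alternative has a precise meaning in the “3+1” formulations numerical relativists use, which split space-time into space plus time (a standard textbook form; the article doesn’t spell it out). The freedom being exploited is the choice of the lapse function α, which sets how much proper time elapses per tick of coordinate time at each grid point:

```latex
% 3+1 (ADM) form of the space-time interval: \alpha is the lapse,
% \beta^i the shift vector, \gamma_{ij} the spatial metric.
% Singularity-avoiding slicings drive \alpha toward 0 near the
% singularity, effectively freezing the clocks there.
ds^{2} = -\alpha^{2} \, dt^{2}
       + \gamma_{ij} \bigl(dx^{i} + \beta^{i} dt\bigr) \bigl(dx^{j} + \beta^{j} dt\bigr)
```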

The first approach, avoiding the singularity, has been more popular, and a group at the Albert Einstein Institute near Berlin has employed it with some success. It has the drawback that slowing down time inevitably adds to the already severe computational demands. With software they call AGAVE, the Penn State-Pittsburgh-Texas team has taken the less-traveled road of surgically removing the singularity from the domain of the calculation. About two years ago, their Pittsburgh collaborators successfully excised the singularity for a single black hole moving in space. AGAVE extends this approach to colliding black holes – in effect, simulating two black holes without the black holes.

How, you might ask, can you compute gravity waves from a black hole if you eliminate the black hole? The secret, says Laguna, is in the horizon. Cosmic censorship. Since information about anything across that threshold is cut off, physical processes outside the horizon aren’t affected by what happens inside. “As long as the spacetimes with and without the singularities agree at the points where the cut is made,” says Laguna, “both situations should be equivalent for an observer outside.”

Much easier to say than to implement, notes Shoemaker. The numerical intricacies of cutting the hole out of the grid-like domain of the computation while, at the same time, keeping track of its movement in time are daunting. AGAVE underwent extensive development and testing on PSC’s CRAY T3E prior to the grazing-collision simulation.
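To make the excision idea concrete, here is a minimal one-dimensional toy (our sketch, emphatically not the AGAVE code): a pulse advected toward a masked region that stands in for the black-hole interior. Because the flow at the excision edge points only inward – the one-dimensional analogue of characteristics falling through a horizon – the upwind stencil at points bordering the mask never reads data from inside it, and no boundary condition is needed there.

```python
# Toy 1-D excision sketch (illustrative only -- not the AGAVE code).
import numpy as np

N = 200
x = np.linspace(0.0, 1.0, N)
dx = x[1] - x[0]
c = 1.0                     # advection speed: everything flows rightward,
                            # i.e. into the "hole" at the excision edge
dt = 0.5 * dx / c           # CFL-limited time step

u = np.exp(-((x - 0.2) / 0.05) ** 2)   # initial Gaussian pulse
mask = x >= 0.7                        # excised region: the "interior"

for step in range(400):
    unew = u.copy()
    for i in range(1, N):
        if not mask[i]:
            # First-order upwind update: the stencil looks upstream
            # (leftward), so points bordering the excision edge never
            # touch data inside the mask.
            unew[i] = u[i] - c * dt / dx * (u[i] - u[i - 1])
    u = unew

# In a moving-hole run the mask must travel with the hole: a grid point
# that re-emerges from the mask has no history and must be refilled by
# extrapolating from live neighbors -- part of the daunting bookkeeping
# Shoemaker describes.
```

The toy captures Laguna’s point in miniature: since nothing propagates out of the masked region, the exterior evolution is the same with or without it.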

The grazing collision is a milestone – compared to the symmetry of a head-on crash, which has been done before – because it adds a layer of complexity and realism. With 40 processors of NCSA’s SGI Origin 2000, it required nearly 100 hours. There are simplifying assumptions, such as two equal-mass black holes, but the result is, you might say, a smashing success that pushes beyond prior work.

Excision tamed the numerical instabilities of the singularity long enough for the two black holes to merge completely and evolve for a short period as one large black hole before the simulation crashed. It’s not the end of the road by any means, stress Laguna and Shoemaker. There are not yet accurate gravity-wave predictions to hand over to LIGO. But the next mountain now looks more climbable. That mountain – two black holes that orbit each other before they coalesce – is a few years away, say the researchers.

Further help is coming, notes Laguna, whose eyes light up thinking of PSC’s new terascale system, more than 2,700 powerful processors with a peak capability of over six trillion calculations per second – a leap forward that will allow the team to push further with AGAVE. “We believe one of the severe problems we have now is that the merged black hole gets too close to the boundaries of the computational domain. With the new machine, we can shift the outer boundary outward.”
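The appeal of all those processors is easy to quantify (a simple scaling estimate, ours, not Laguna’s): on a uniform grid at fixed resolution, the number of points – and with it memory and work per time step – grows as the cube of the domain size, so even a modest push of the outer boundary is expensive:

```latex
% Cost of a uniform 3-D grid of half-width R at grid spacing \Delta x:
% doubling R multiplies points, memory, and per-step work by 2^3 = 8.
N_{\mathrm{points}} \sim \left( \frac{2R}{\Delta x} \right)^{3}
```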

Some day, not that far away, a crackle of static will come in from the cosmos. Was Einstein right? Are there really black holes? When two of these monsters swallow each other, does it create a tidal wave of gravity detectable on our tiny planet thousands or millions of light years away? Please place your bets now.

More information, including graphics: http://www.psc.edu/science/laguna.html
