THE DANCE OF TWO BLACK HOLES

October 27, 2000

by Michael Schneider, Pittsburgh Supercomputing Center

Pittsburgh, PA. — Once upon a time, on a small planet in a galaxy called the Milky Way, black holes were considered a fascinating theoretical artifact of the mathematics of general relativity – an interesting concept, great stuff for science fiction. We’ve come a long way since 1915, when Einstein laid out the theory that rocked our world.

In 1967, American physicist John Wheeler coined the term “black hole,” giving a name and a resonance to the concept of points in space-time where matter is so condensed, and gravity so fiercely omnivorous, that everything that strays too close, including light, is swallowed. Only 12 years ago, with observational evidence beginning to trickle in – gas and stars swirling at the centers of galaxies – Stephen Hawking wrote, prophetically, in A Brief History of Time: “The number of black holes may well be greater even than the number of visible stars.”

Since 1994, the Hubble Space Telescope and, more recently, NASA’s Chandra X-ray Observatory have convincingly lifted black holes from theory into reality. With data from these eyes in space, scientists have identified over 30 likely black holes and counting. They come in a range of sizes: supermassive, like the monster with the mass of 30 million suns at the center of the Andromeda galaxy; small, at a few solar masses; and, most recently, a middleweight of about 500 solar masses in the galaxy M82.

Still, even with Hubble and Chandra, the evidence is circumstantial. Fundamentally, a black hole is invisible. Looking for one, as Hawking said, is like trying to find a black cat in a coal cellar. The observations offer reasoned surmises about an undetectable agent lurking in the interior of detectable phenomena. As Penn State astrophysicist Pablo Laguna and post-doctoral fellow Deirdre Shoemaker like to point out, the way to establish indisputably that black holes exist and that Einstein’s equations are right is to detect gravity waves from two black holes.

Detecting gravity waves is the job, a big one, cut out for LIGO, Virgo and GEO600. LIGO (Laser Interferometer Gravitational-Wave Observatory) comprises two NSF-funded gravity-wave detectors – in Livingston, Louisiana, and Hanford, Washington – now undergoing testing. Virgo and GEO600 are under construction in Europe, near Pisa, Italy, and Hannover, Germany, respectively. Together these projects represent a pioneering effort that scientists hope will lead the way to an invaluable new set of eyes – gravity eyes – for seeing the universe. But it won’t be easy, especially since no one has ever detected a gravity wave.

Along with anticipating black holes, Einstein’s theory predicts that the accelerated motion of massive objects – supernova explosions, black holes whirling around each other – will produce ripples traveling at light-speed through space-time. As with black holes, there’s indirect evidence he was right, but compared to other wave phenomena, like the electromagnetism that brings us radio and TV, gravity waves are very weak. Einstein speculated they might never be detected. If you think of LIGO as the gigantic antenna for a radio receiver, the strongest possible signal might be a faint crackle as you turn the dial. To improve the chances of hearing the first crackle of gravity from the cosmos, LIGO needs to know where to set the dial to tune in two black holes colliding with each other.
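How weak is weak? A standard back-of-the-envelope figure (added here for illustration; the numbers are not from the article): a passing gravity wave is measured by the dimensionless strain it induces, the fractional change in a length L,

    h = \frac{\Delta L}{L} \sim 10^{-21}

for stellar-mass black holes merging hundreds of millions of light years away. Over LIGO’s 4-kilometer arms, that strain moves the mirrors by roughly 4 × 10^-18 meters – far smaller than an atomic nucleus. That is the faint crackle the instruments must pick out of the noise.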

To do this, researchers like Laguna and Shoemaker are using supercomputers, the most powerful they can find, to numerically solve Einstein’s equations. Their field is called numerical relativity, and with collaborators at the University of Texas and the University of Pittsburgh, Penn State has assembled one of the leading groups in the world. In recent work, relying on systems at PSC, at NCSA in Illinois and elsewhere, this multi-university team successfully simulated two black holes merging in what’s called a grazing collision – only the second time this has been accomplished. Their numerical approach, called black-hole excision, makes a notable dent in the two-black-hole problem, the major challenge of this challenging field.

“Einstein’s equations describe gravity via an elegant but complicated set of non-linear partial differential equations,” says Laguna. “Their complexity requires the most powerful supercomputers available. Accurately solving the two-black-hole problem, formulated conceptually by Einstein 80 years ago, will represent an historic moment in the development of general relativity theory, with extremely important implications for astrophysics and cosmology.”
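For the curious, the equations Laguna refers to can be written deceptively compactly (standard textbook notation, shown here for reference):

    G_{\mu\nu} \equiv R_{\mu\nu} - \tfrac{1}{2} R\, g_{\mu\nu} = \frac{8\pi G}{c^4}\, T_{\mu\nu}

Behind that one line sit ten coupled, non-linear partial differential equations for the metric g_{\mu\nu}, which encodes the curvature of space-time; the stress-energy tensor T_{\mu\nu} on the right describes the matter and energy that produce the curvature.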

The mathematics of a single spherical black hole sitting motionless in space was worked out long ago by German astronomer Karl Schwarzschild, who in 1916, shortly before his death, in effect discovered the black hole, without naming it, as one of the implications of Einstein’s theory. A single black hole by itself, however, doesn’t make gravity waves. Add another black hole – the interesting and, many believe, highly relevant situation of two black holes merging with each other, often called a binary black hole – and the mathematics becomes fiendishly complicated, to the point where the only hope is supercomputers.
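Schwarzschild’s solution can be stated in a single line (a standard modern form, added for reference):

    ds^2 = -\left(1 - \frac{2GM}{c^2 r}\right) c^2 dt^2 + \left(1 - \frac{2GM}{c^2 r}\right)^{-1} dr^2 + r^2\, d\Omega^2

Everything about a lone, non-spinning black hole of mass M is contained in that expression. No comparable closed-form solution exists for two holes in orbit, which is why the problem falls to computers.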

“As in most physical studies,” says Shoemaker, “you want to look at the complicated and more realistic situations to test what you know. With general relativity, you can’t put two of these compact objects together and get a solution without advanced computational techniques. Two black holes takes the theory into a dynamical regime, where you can make predictions and then, if experiments verify the predictions, you know how far the theory is correct.”

It’s a mutually beneficial relationship. To verify the predictions, you need detectors. LIGO, Virgo and GEO600, likewise, need predictions. Many believe that colliding black holes is the best shot at detecting gravity waves. Theory says it’s one of the strongest signals on the gravity-wave dial. To know if a crackle of static is the dance of two black holes or cosmic noise, the detectors need the answers numerical relativists are working to provide.

“Abandon hope, all ye who enter here,” said Dante of the entrance to Hell. He might have said the same about the event horizon of a black hole. In solving Einstein’s equations, Schwarzschild started with the idea of an infinitely condensed mass and showed that space-time curves around it and closes on itself. Once matter or light enters space-time within a certain radius from that point – initially called the Schwarzschild radius, now the event horizon – there’s no escape. The region inside the horizon is cut off from events outside. This principle, called cosmic censorship, underlies black-hole excision.
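The radius of no return follows directly from Schwarzschild’s solution (the standard formula, added for illustration):

    r_s = \frac{2GM}{c^2}

For the mass of the Sun, r_s works out to about three kilometers; for the 30-million-solar-mass monster in Andromeda, roughly 90 million kilometers, more than half the distance from the Earth to the Sun.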

The killer for simulating black holes is the singularity, the point of infinite density and space-time curvature that, mathematically speaking, makes a black hole a black hole. “The most crucial aspect of numerically evolving spacetimes containing black holes,” says Laguna, “is without doubt the accurate and long-term handling of the singularities these objects represent.”

Simply put, the numbers get too big too fast, and the computation crashes. “If you get too far inside the black hole,” says Shoemaker, “you run into huge gradients that kill your calculations. There are basically two alternatives. In one of them you exploit the relativity of time; in effect you slow down how fast clocks tick near the black hole to avoid approaching that area. The other way is to remove the dangerous area. We did the latter.”
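In the 3+1 language of numerical relativity, the first alternative has a precise expression (a sketch in conventional notation, not necessarily the formulation any particular group uses): a “lapse” function \alpha sets how much proper time \tau elapses per unit of coordinate time t at each grid point,

    d\tau = \alpha(x, t)\, dt

and singularity-avoiding slicings drive \alpha toward zero near the hole, so the grid effectively stops marching forward in time before it reaches the dangerous region.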

The first approach, avoiding the singularity, has been more popular, and a group at the Albert Einstein Institute near Berlin has employed it with some success. It has the drawback that slowing down time inevitably adds to the already severe computational demands. With software they call AGAVE, the Penn State-Pittsburgh-Texas team has taken the less-traveled road of surgically removing the singularity from the domain of the calculation. About two years ago, their Pittsburgh collaborators successfully excised the singularity for a single black hole moving in space. AGAVE extends this approach to colliding black holes – in effect, simulating two black holes without the black holes.

How, you might ask, can you compute gravity waves from a black hole if you eliminate the black hole? The secret, says Laguna, is in the horizon. Cosmic censorship. Since information about anything across that threshold is cut off, physical processes outside the horizon aren’t affected by what happens inside. “As long as the spacetimes with and without the singularities agree at the points where the cut is made,” says Laguna, “both situations should be equivalent for an observer outside.”
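A loose illustration of the principle (a toy sketch in Python, emphatically not the AGAVE code): in a one-dimensional advection problem whose characteristics all point one way – a stand-in for light rays falling inward at a horizon – a chunk of the grid can be excised and never referenced again, and the solution outside never notices.

    # toy_excision.py -- illustrative sketch only, not the AGAVE code.
    # A pulse obeying u_t + c u_x = 0 with c < 0 moves left, "into the
    # hole."  Grid points 0..i_exc-1 are excised; because the upwind
    # update for c < 0 looks only to the RIGHT, the excised interior
    # is never referenced, and nothing in it can leak back out.
    import numpy as np

    nx = 400
    x = np.linspace(0.0, 1.0, nx)
    dx = x[1] - x[0]
    c = -1.0                        # everything flows toward the "hole"
    dt = 0.5 * dx / abs(c)          # CFL-limited time step

    u = np.exp(-((x - 0.7) ** 2) / 0.005)   # Gaussian pulse, outside the hole
    i_exc = 80                               # excision boundary index

    for step in range(600):
        un = u.copy()
        # upwind update: each point uses only its right-hand neighbor
        u[i_exc:-1] = un[i_exc:-1] - c * dt / dx * (un[i_exc + 1:] - un[i_exc:-1])
        u[-1] = 0.0                 # simple outer boundary condition
        u[:i_exc] = 0.0             # excised region stays inert

    # the pulse has crossed the excision boundary and vanished, with
    # nothing reflected back into the computational domain
    print("max |u| outside the excised region:", np.abs(u[i_exc:]).max())

The toy works because no boundary condition is needed where every characteristic points into the excised region – exactly the property the horizon guarantees in the real problem.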

Much easier to say than implement, notes Shoemaker. The numerical intricacies of cutting out the hole from the grid-like domain of the computation while, at the same time, keeping track of its movement in time are daunting. AGAVE underwent extensive development and testing on PSC’s CRAY T3E prior to the grazing-collision simulation.

The grazing collision is a milestone because, compared to the symmetry of a head-on crash, which had been done before, it adds a layer of complexity and realism. On 40 processors of NCSA’s SGI Origin 2000, the simulation required nearly 100 hours. There are simplifying assumptions, such as two black holes of equal mass, but the result is, you might say, a smashing success that pushes beyond prior work.

Excision tamed the numerical instabilities of the singularity long enough for the two black holes to merge completely and evolve for a short period as one large black hole before the simulation crashed. It’s not the end of the road by any means, stress Laguna and Shoemaker. There are not yet accurate gravity-wave predictions to hand over to LIGO. But the next mountain now looks more climbable. That mountain – two black holes that orbit each other before they coalesce – is a few years away, say the researchers.

Further help is coming, notes Laguna, whose eyes light up thinking of PSC’s new terascale system: more than 2,700 powerful processors with a peak capability of over six trillion calculations per second, a leap forward that will allow the team to push further with AGAVE. “We believe one of the severe problems we have now is that the merged black hole gets too close to the boundaries of the computational domain. With the new machine, we can shift the outer boundary outward.”

Some day, not that far away, a crackle of static will come in from the cosmos. Was Einstein right? Are there really black holes? When two of these monsters swallow each other, does it create a tidal wave of gravity detectable on our tiny planet thousands or millions of light years away? Please place your bets now.

More information, including graphics: http://www.psc.edu/science/laguna.html
