Since 1986 - Covering the Fastest Computers in the World and the People Who Run Them

March 3, 2014

Supercomputers Advance Understanding of Black Holes

Tiffany Trader

Black holes, so fascinating to star-gazers of the professional and backyard variety alike, are definitely not empty as their name might imply. Quite the contrary: they are exceedingly dense. According to NASA, these astronomical objects pack a great amount of matter into a very small area — imagine a star ten times more massive than the Sun squeezed into a region the size of New York City. The result is a gravitational field so strong that nothing can escape, not even light.

Scientists can’t observe black holes directly, but they can infer their presence by scrutinizing the effect a black hole has on the gas and matter that lie just outside its event horizon. This infalling material also generates heat and energy that is radiated, in part, as light.

Knowledge about black holes has grown tremendously in recent years through indirect exploration methods, with researchers employing detailed numerical models and powerful supercomputers to simulate the complex dynamics that take place at the perimeter. To achieve accuracy, simulations of this complex scenario must account for numerous phenomena, including warped spacetime, gas pressure, ionizing radiation, and magnetized plasma.

A team of astrophysicists, led by Scott Noble from Rochester Institute of Technology, created a new tool that predicts the light an accreting black hole would produce. It works by modeling how photons interact with gas particles in the disk around the black hole (known as an accretion disk), generating light — specifically light in the X-ray spectrum — and producing signals that can be detected with powerful X-ray telescopes.
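To get an intuitive feel for why an accretion disk shines in X-rays, consider a minimal sketch below. This is emphatically not the team's general-relativistic radiation code — it is a simple Newtonian multi-temperature blackbody model (in the spirit of the classic Shakura–Sunyaev thin disk), with fiducial values for the black hole mass and accretion rate chosen for illustration:

```python
import math

# Toy model: thermal spectrum of a thin accretion disk around a
# stellar-mass black hole, built by summing blackbody emission from
# concentric annuli. Illustrative only; all parameters are assumed.

H = 6.626e-34      # Planck constant, J*s
C = 3.0e8          # speed of light, m/s
K_B = 1.381e-23    # Boltzmann constant, J/K
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/m^2/K^4
G = 6.674e-11      # gravitational constant
M_SUN = 1.989e30   # solar mass, kg

def disk_temperature(r, m_bh=10 * M_SUN, mdot=1e15):
    """Effective temperature (K) at radius r (m) of a thin disk:
    T ~ (3 G M mdot / (8 pi sigma r^3))^(1/4); the inner-boundary
    correction term is dropped for simplicity."""
    return (3 * G * m_bh * mdot / (8 * math.pi * SIGMA * r**3)) ** 0.25

def planck(nu, t):
    """Blackbody specific intensity B_nu(T) at frequency nu (Hz)."""
    x = H * nu / (K_B * t)
    if x > 700:            # exp would overflow; contribution is negligible
        return 0.0
    return 2 * H * nu**3 / C**2 / math.expm1(x)

def disk_spectrum(nu, r_in, r_out, n=2000):
    """Integrate annular contributions: L_nu ~ sum 2*pi*r*B_nu(T(r))*dr."""
    total, r = 0.0, r_in
    dr = (r_out - r_in) / n
    for _ in range(n):
        total += 2 * math.pi * r * planck(nu, disk_temperature(r)) * dr
        r += dr
    return total

# Inner disk of a ~10 solar-mass black hole (inner edge near ~90 km)
# reaches millions of kelvin, so kT lands in the X-ray band (~1 keV).
print(f"inner-edge temperature: {disk_temperature(9e4):.2e} K")
```

With these assumed fiducial numbers, the inner edge of the disk sits near ten million kelvin, which is why stellar-mass black holes are most readily studied with X-ray satellites rather than optical telescopes.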

The researchers relied on powerful supercomputing resources from The Texas Advanced Computing Center (TACC) at The University of Texas at Austin to generate images of light signals from a black hole simulation. With this method and the computational power of the Ranger system (which was retired in early February), the researchers for the first time were able to explain nearly all the components seen in the X-ray spectra of stellar-mass black holes.

The ability to produce realistic light signals from a black hole simulation marks a new era for astrophysics. Based on the new techniques that were devised for this project, researchers will be able to explain numerous other observations taken with multiple X-ray satellites over the past 40 years.

It’s an exciting time for black hole researchers with each year revealing more details about their significance in shaping the cosmos.

“Nearly every good-sized galaxy has a supermassive black hole at its center,” said Julian Krolik, a professor of physics and astronomy at Johns Hopkins University. Over multi-million-year periods, these black holes accrete enormous amounts of gas, releasing tremendous amounts of energy in the process. During one of these periods, a black hole can produce as much as 100 times the power output of all the stars in its host galaxy put together.

“Some of that energy can travel out into their surrounding galaxies as ionizing light or fast-moving jets of ionized gas,” Krolik added. “As a result, so much heat can be deposited in the gas orbiting around in those galaxies that it dramatically alters the way they make new stars. It’s widely thought that processes like this are largely responsible for regulating how many stars big galaxies hold.”
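That "100 times the host galaxy" figure can be sanity-checked with the standard accretion luminosity relation L = η·ṁ·c². The efficiency η ≈ 0.1, the accretion rate of one solar mass per year, and the galaxy luminosity used below are common fiducial values assumed for illustration, not figures from the article:

```python
# Back-of-the-envelope check: accretion power vs. a galaxy's starlight.
C = 3.0e8          # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg
YEAR = 3.156e7     # seconds per year
L_SUN = 3.828e26   # solar luminosity, W

eta = 0.1                   # typical radiative efficiency (assumed)
mdot = M_SUN / YEAR         # 1 solar mass per year, in kg/s (assumed)
L_accretion = eta * mdot * C**2

# A large galaxy's stars emit very roughly ~2.5e10 L_sun in total (assumed).
L_galaxy = 2.5e10 * L_SUN

print(f"accretion power: {L_accretion:.1e} W")
print(f"ratio to galactic starlight: {L_accretion / L_galaxy:.0f}x")
```

Even this crude estimate lands within an order of magnitude of the quoted figure — a modest accretion rate onto a supermassive black hole can outshine the combined starlight of its host galaxy.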

Ranger was a Sun Constellation system hosted by the Texas Advanced Computing Center that was operational from 2008 to 2013. In early 2015, TACC expects to welcome Wrangler, a new NSF-supported, big-data-driven system, which will join Stampede, one of the most powerful supercomputers in the world.