Astrophysicists at the CEA (the French Alternative Energies and Atomic Energy Commission) and CNRS (the French National Center for Scientific Research) have achieved a major breakthrough. Thanks to a set of highly precise supercomputer simulations, the scientists have a much keener understanding of the turbulence that is generated when two galaxies collide. The study used very high resolution numerical simulations in which the disordered motions of the gas contained in galaxies are resolved at extremely small scales.
Appearing in the May 2014 edition of the Monthly Notices of the Royal Astronomical Society: Letters, the study resolves a long-standing cosmic mystery, a phenomenon called “starbursts.” Stars form when gas contained in a galaxy becomes dense enough to collapse in on itself under gravity. As they form, they emit very intense ultraviolet and infrared light. When two galaxies collide, the effect is multiplied and a large number of stars blink into existence. The resulting peak emission of light is called a “starburst.”
Astrophysicists had observed such galactic collision light shows before, but could not explain why the stars formed. When galaxies collide, the galactic gas becomes more disordered, and the vortices of turbulence that result should theoretically prevent the gas from condensing under gravity. In this scenario, turbulence would actually slow star formation, putting the brakes on it. The reasoning seemed solid, except that it is the exact opposite of what actually happens. The new simulations, among the most sophisticated yet, fill in the missing details for the first time.
The simulations show that the collision changes the nature of the turbulence at very small scales. The vortex motions are replaced by a compressive mode: rather than stirring the gas, the turbulence now compresses it and helps it collapse. This compressive turbulence produces an excess of dense gas that causes stars to form all throughout the colliding galaxies. Compressive turbulence not only explains the mystery of star formation in these collisions, it also sheds light on why some galaxies form more stars than others.
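The split between vortex (solenoidal) motions and compressive motions described above is commonly measured with a Helmholtz decomposition of the gas velocity field. The sketch below is a minimal illustration in Python/NumPy, not the authors' actual analysis code, and it assumes a periodic, uniform grid:

```python
import numpy as np

def helmholtz_decompose(vx, vy, vz):
    """Split a periodic 3-D velocity field into a compressive (curl-free)
    part and a solenoidal (divergence-free) part via an FFT projection."""
    shape = vx.shape
    kx, ky, kz = np.meshgrid(
        np.fft.fftfreq(shape[0]),
        np.fft.fftfreq(shape[1]),
        np.fft.fftfreq(shape[2]),
        indexing="ij",
    )
    k2 = kx**2 + ky**2 + kz**2
    k2[0, 0, 0] = 1.0  # avoid division by zero; the k=0 mode has no direction

    fvx, fvy, fvz = np.fft.fftn(vx), np.fft.fftn(vy), np.fft.fftn(vz)
    # Projecting the Fourier modes onto the wavevector keeps the
    # longitudinal (compressive) part of the motion.
    kdotv = kx * fvx + ky * fvy + kz * fvz
    comp = [np.real(np.fft.ifftn(k * kdotv / k2)) for k in (kx, ky, kz)]
    # Whatever is left is the transverse (solenoidal, vortex-like) part.
    sol = [v - c for v, c in zip((vx, vy, vz), comp)]
    return comp, sol
```

Comparing the energy in the two parts quantifies how much of the turbulence compresses the gas versus merely stirring it.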
The research would not have been possible without the assistance of some of the most powerful supercomputers in the world. These high resolution models – which represent two real-life galaxies: the Milky Way and the “Antennae Galaxies” – employed two supercomputers that are part of the European research infrastructure, PRACE: GENCI’s Curie supercomputer, housed at the CEA’s Computing Center, and the SuperMUC supercomputer, located at the Leibniz Supercomputing Centre in Garching, Germany.
The Milky Way simulation, carried out on the Curie supercomputer, covered a span of about 300,000 light-years, with a resolution of 0.1 light-year. This part of the study used the equivalent of 12 million computing hours over a period of 12 months.
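The figures above imply an enormous dynamic range. A back-of-the-envelope check (my arithmetic, not the article's) shows why a uniform grid at that resolution would be infeasible; the article does not describe the method, but simulations of this kind typically concentrate resolution only where the gas is dense:

```python
# Milky Way run: ~300,000 light-years across at 0.1 light-year resolution.
span_ly = 300_000
resolution_ly = 0.1

# Dynamic range per axis, and the cell count a *uniform* grid would need.
cells_per_axis = span_ly / resolution_ly   # ~3 million cells per axis
uniform_cells = cells_per_axis ** 3        # ~2.7e19 cells in total
print(f"{cells_per_axis:.0e} cells per axis, {uniform_cells:.1e} cells total")
```

A grid of roughly 10^19 cells is far beyond any computer's memory, which is why reaching 0.1 light-year only where it matters is what makes such runs possible.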
The galactic collision was simulated on the SuperMUC supercomputer, which has 4,096 processors running in parallel. It took 8 million computing hours over a period of eight months to simulate a cube 600,000 light-years on each side, with a resolution of 3 light-years. According to CEA officials, these are the most realistic simulations to date of the observed events.
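The quoted totals also indicate how long the collision run would have taken if all 4,096 processors had computed without interruption. A quick sketch, assuming "computing hours" means processor-hours (the article does not define the term):

```python
# SuperMUC collision run: 8 million processor-hours on 4,096 processors.
core_hours = 8_000_000
processors = 4_096

# Ideal wall-clock time with every processor busy the whole time.
wall_hours = core_hours / processors   # ~1,950 hours
wall_days = wall_hours / 24            # ~81 days
print(f"about {wall_days:.0f} days of continuous computation")
```

Roughly 81 days of continuous computation fits comfortably within the eight-month calendar span, which also covers queueing, checkpointing, and restarts.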
“These new simulations have achieved a level of precision never seen before, making it possible to resolve structures with a mass 1,000 times smaller than ever before,” notes a press release. “This has enabled the astrophysicists to track the evolution of the galaxies over hundreds of thousands of light-years, and to explore a mere fraction of a light-year in detail. Thanks to this decisive advantage, new physical effects emerged, revealing the complex nature of turbulence.”