SCIENCE & ENGINEERING NEWS
SAN DIEGO, Calif. — Jack Lucentini reports that when scientists released a map in June showing the results of the biggest telescope survey of the universe to date, the map’s major features turned out to be pleasantly familiar to a group of computational cosmologists.
Those researchers had already created striking simulations of the universe’s large-scale structures on supercomputers – using, instead of telescopes, equations representing the laws of physics.
The real sky map, then, helped scientists identify which assumptions used for the simulations were most accurate. This in turn helped answer a host of questions about the origins and likely future of the universe.
In recent years, an exchange of data between observational and computational astronomy has yielded unusual convergence and clarity among those trying to explain the mysteries of the universe.
Supercomputers, powerful machines that perform billions or even trillions of calculations per second, have become a crucial supporting tool for space research.
Researchers use the computers to study questions ranging from the big ones just mentioned to finer points, such as how galaxies, stars and planets formed. The field has come into its own since a handful of researchers started making the first simulations in the early 1980s.
“Right now, computational astrophysics is probably equal in size to observational astrophysics” as a field, said Mordecai-Mark Mac Low, assistant curator in the department of astrophysics at the American Museum of Natural History in New York.
But the limitations of the machines are frustrating scientists, just as they start to close in on answers. Computers capable of making truly realistic simulations are still several years away, experts say.
The simulations used for comparison with the recent galaxy map illustrate both the triumphs and the shortcomings of the technology.
An international collaboration of researchers called the Virgo Consortium performed the simulations using a 512-processor Cray supercomputer at the Max Planck Society in Garching, Germany.
In 1998, the group released the largest cosmological simulation to date. It showed how tiny ripples in space, left over from the original explosion that created the universe roughly 13 billion years ago, grew under gravity’s influence into giant clusters of galaxies.
The simulations showed galaxies gathering into a vast, cobweb-like network spanning the observable universe. The structures were strikingly similar to those shown by the galaxy map released two years later by British, Australian and U.S. researchers in a project called the Two-Degree Field Galaxy Redshift Survey. The simulations proved invaluable in explaining the observed results.
“It’s only through those simulations that you can understand how the tiny little ripples in the early universe can grow into galaxies,” said John M. Blondin, an associate professor of theoretical astrophysics at North Carolina State University, who was not involved with the projects.
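The mechanism behind Blondin’s point is standard gravitational instability. As a textbook aside (not drawn from the article): a small density contrast \delta = (\rho - \bar{\rho})/\bar{\rho} in an expanding, matter-dominated universe obeys the linearized equation

    \ddot{\delta} + 2H\dot{\delta} = 4\pi G\,\bar{\rho}\,\delta ,

whose growing solution, \delta \propto a(t) \propto t^{2/3}, shows how slightly overdense ripples steadily amplify, under gravity alone, into galaxies and clusters.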
The simulations assumed a universe with an invisible energy causing it to expand ever more quickly. This factor, called the cosmological constant, had been suggested by observations of distant exploding stars two years earlier.
The comparisons between the simulations and the actual sky as revealed by the mapping project strongly bolstered the evidence for this energy. This indicates that our universe may well expand forever, eventually becoming a cold, nearly empty void as the stars die out one by one.
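Why a cosmological constant implies runaway expansion can be seen from the standard acceleration equation of cosmology (again, a textbook relation supplied for context, not a formula from the survey teams):

    \frac{\ddot{a}}{a} = -\frac{4\pi G}{3}\left(\rho + \frac{3p}{c^2}\right) + \frac{\Lambda c^2}{3} ,

where a(t) is the scale factor of the universe. As expansion dilutes the matter density \rho, a positive \Lambda eventually dominates the right-hand side, making \ddot{a} positive: the expansion speeds up and never stops.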
Similar computations were conducted in the United States by the Grand Challenge Cosmology Consortium, a collaboration of researchers from Princeton, the Massachusetts Institute of Technology, and several other universities.
This team, working at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign and the Pittsburgh Supercomputing Center, succeeded in explaining a riddle that had puzzled scientists for decades.
In 1997, the group simulated Lyman-alpha clouds, vast seas of hydrogen gas that appear to float between the galaxies. Researchers had been mystified as to how the clouds avoided being sucked into the galaxies by gravity.
Over the years, “various ad hoc models were proposed for how they stayed as separate entities,” said Renyue Cen, a senior research astrophysicist at Princeton University and a team leader with the Grand Challenge group.
“We are the first group to have done the cosmological simulation of these clouds without assuming any ad hoc physical conditions, and showing that these clouds were actually natural products of the molecular structure of the [early] universe,” he explained.
The simulations backed up the theory that the universe is made up mostly of “cold dark matter” – oceans of slowly moving particles that are undetectable by our telescopes. Scientists increasingly believe this invisible material is the dominant component of our universe.
But one limitation of such whole-universe supercomputer simulations is that they fail to resolve the small-scale structure of the universe: individual galaxy clusters and galaxies. The computational power simply isn’t there. Explanations of the universe thus remain incomplete.
Galaxy simulations have been done separately. The American Museum of Natural History, for instance, is putting together a supercomputer called GRAPE 6 that will calculate the orbits of galaxies and stars.
The machine, which museum officials say will be the fastest computer in the world when it is completed in 2001, will help answer the still murky questions of how stars and galaxies formed.
But GRAPE 6 performs one relatively simple calculation over and over: the gravitational attraction between pairs of bodies. Far more complicated is the task of calculating the many other factors that shape the real universe: radiation, temperature, shock waves and more.
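To give a flavor of that repeated calculation, here is a minimal Python sketch of a direct pairwise gravity sum. It is a generic illustration of the technique, not GRAPE 6’s actual code; the gravitational constant G, the softening length EPS (which prevents infinite forces when two bodies nearly coincide) and the simulation units are all assumptions made for the sketch.

    import numpy as np

    G = 1.0     # gravitational constant in simulation units (assumed)
    EPS = 0.01  # softening length; avoids infinite force at zero separation

    def accelerations(pos, mass):
        """Direct O(n^2) sum: Newtonian acceleration on each body from all others."""
        acc = np.zeros_like(pos)
        for i in range(len(mass)):
            dr = pos - pos[i]                      # vectors from body i to every body
            dist3 = (np.sum(dr * dr, axis=1) + EPS**2) ** 1.5
            weights = mass / dist3                 # m_j / |r_j - r_i|^3 for each body j
            weights[i] = 0.0                       # a body exerts no force on itself
            acc[i] = G * np.sum(weights[:, None] * dr, axis=0)
        return acc

    # Usage: ten bodies at random positions with unit masses
    pos = np.random.rand(10, 3)
    mass = np.ones(10)
    print(accelerations(pos, mass))

GRAPE-style hardware gains its speed by hard-wiring exactly this inner loop into special-purpose chips, evaluating enormous numbers of such pairwise terms in parallel.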
Modeling those additional factors requires tying together the large-scale structures of the universe with smaller-scale pieces such as galaxies, Mac Low said. This is the real sticking point of current supercomputer technology, he said.
The reason, he explained, is that while it has been relatively simple to build machines with ever more raw computing power, it is much harder to make the growing number of processors – the computers’ “brains” – communicate effectively. The next few years of research will be along those lines, he predicted.
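Communication between processors in such codes is typically handled with message passing. The following Python fragment, using the mpi4py library, is a hypothetical sketch of the kind of boundary exchange these simulations perform at every time step; the slab size and the ring topology are assumptions for illustration, not details from any of the groups’ codes.

    from mpi4py import MPI
    import numpy as np

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()
    size = comm.Get_size()

    # Each processor evolves one slab of the simulated volume...
    local_slab = np.random.rand(64, 64)

    # ...but must trade edge ("ghost") values with its neighbors each step.
    left, right = (rank - 1) % size, (rank + 1) % size
    ghost_row = comm.sendrecv(local_slab[0], dest=left, source=right)

    # As processor counts grow, these exchanges, rather than the
    # arithmetic itself, increasingly dominate the running time.

Run under mpirun with many processes, each processor spends part of every step waiting on its neighbors, which is why scaling to thousands of processors is the hard part.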
“Modern galaxy-formation models are developing galaxies that are much too big. The missing piece of physics may be shock waves” from exploding stars, he explained, which current simulations don’t account for. “That’s the tie back to the larger scale.”