A Dark Matter for Astrophysics Research

By Nicole Hemsoth

May 31, 2011

Back in 2008, the Sloan Digital Sky Survey (SDSS) came to an end, leaving behind hundreds of terabytes of publicly available data that have since been used in a range of research projects. Based on this data, researchers have discovered distant quasars powered by supermassive black holes in the early universe, uncovered collections of sub-stellar objects, and mapped extended mass distributions around galaxies using weak gravitational lensing.

Among the diverse groups of scientists tackling problems that can now be explored with the SDSS data is a team led by Dr. Risa Wechsler from Stanford University’s Department of Physics and the SLAC National Accelerator Laboratory.

Wechsler is interested in the process of galaxy formation, the development of the universe’s large-scale structure, and what these can tell us about the fundamental physics of the universe. Naturally, dark energy and dark matter enter the equation when one considers galaxy formation, and there are few better keys to probing these questions than the data generated by the SDSS.

Just as the Sloan Digital Sky Survey presented several new data storage and computational challenges, so too do the efforts to extract meaningful discoveries. Teasing apart important information for simulations and analysis generates its own string of terabytes on top of the initial SDSS data. This creates a dark matter of its own for computer scientists as they struggle to keep pace with ever-expanding volumes that are outpacing the capability of the systems designed to handle them.

Wechsler’s team used the survey’s astronomical data to compare the luminosity of millions of galaxies to that of our own Milky Way. All told, the project took images of nearly one-quarter of the sky, creating its own data challenges. The findings revealed that galaxies with two nearby satellites like the Large and Small Magellanic Clouds are rare: only about four percent of galaxies resemble the Milky Way in this respect.

To arrive at their conclusions, the group downloaded all of the publicly available Sloan data and began looking for satellite galaxies around Milky Way analogs, combing through about a million galaxies with spectroscopy to select a mere 20,000 with luminosity similar to that of our own galaxy. With these galaxies identified, they undertook the task of mining the images for evidence of nearby fainter galaxies via a randomized review method. As Wechsler noted, the companion simulation of a region of the universe, run on the Pleiades supercomputer at NASA Ames with 8 billion particles, took roughly 6.5 million CPU hours, making it one of the largest simulations ever performed in terms of particle count. She said that moving to smaller box sizes takes far more CPU time per particle because the universe is more clustered on smaller scales.
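To make the selection step concrete, the sketch below shows one way such a luminosity cut might look in Python. The catalog values, the assumed Milky Way absolute magnitude, and the tolerance window are all hypothetical placeholders, not details from Wechsler’s actual pipeline.

```python
# Minimal sketch: selecting Milky Way analogs from a spectroscopic
# catalog by luminosity. The magnitude value and tolerance below are
# illustrative assumptions, not the team's actual selection criteria.
import numpy as np

def select_mw_analogs(r_abs_mag, mw_mag=-21.2, tolerance=0.2):
    """Return indices of galaxies whose absolute magnitude lies
    within `tolerance` of an assumed Milky Way value."""
    r_abs_mag = np.asarray(r_abs_mag)
    mask = np.abs(r_abs_mag - mw_mag) < tolerance
    return np.flatnonzero(mask)

# Example: a toy catalog of ~1 million entries cut down to a small
# sample of candidate analogs.
rng = np.random.default_rng(0)
catalog = rng.normal(-20.0, 1.0, size=1_000_000)  # toy magnitudes
analogs = select_mw_analogs(catalog)
print(f"{analogs.size} candidate Milky Way analogs")
```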

Wechsler described the two distinct pipelines required for this type of research. First, there is the simulation, in which researchers look for galaxies in a model universe. Wechsler told us that this simulation was run on the Pleiades machine at Ames across 10,000 CPUs. From there, the team performed an analysis of the simulation, which shows the evolution of structure in this piece of the universe across its entire history of almost 14 billion years, a process that involves examining dark matter halo histories across cosmic time. As she noted, the team was “looking for gravitationally bound clumps in that dark matter distribution; you have a distribution of matter at a given time and you want to find the peaks in that density distribution since that is where we expect galaxies to form. We were looking for those types of peaks across the 200 snapshots we took to summarize that entire 14 billion year period.”
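The peak-finding Wechsler describes can be illustrated with a toy example: deposit particles onto a density grid and flag local maxima above some overdensity threshold. Production halo finders (friends-of-friends, spherical overdensity) are far more sophisticated; everything below, including the grid size and threshold, is an illustrative assumption rather than the team’s method.

```python
# Conceptual sketch: grid the particle positions and flag local
# density maxima as candidate halo sites. Parameters are toy values.
import numpy as np
from scipy.ndimage import maximum_filter

def find_density_peaks(positions, box_size, n_cells=64, overdensity=5.0):
    """Histogram particles onto a 3D grid and return the indices of
    cells that are local maxima above `overdensity` times the mean."""
    grid, _ = np.histogramdd(positions, bins=n_cells,
                             range=[(0, box_size)] * 3)
    # mode="wrap" treats the box as periodic, as cosmological boxes are.
    local_max = maximum_filter(grid, size=3, mode="wrap") == grid
    peaks = local_max & (grid > overdensity * grid.mean())
    return np.argwhere(peaks)

rng = np.random.default_rng(1)
pts = rng.random((100_000, 3)) * 100.0  # toy particles in a 100-unit box
print(len(find_density_peaks(pts, box_size=100.0)), "candidate peaks")
```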

The team needed to understand the evolutionary processes that occurred across the billions of years captured in those 200 distinct moments. This meant they had to trace the particles from one snapshot to the next in their clumps, which are called dark matter halos. Once the team found the halos, which again are associated with galaxy formation, they performed a statistical analysis that sought out anything resembling our own Milky Way. Wechsler told us that “the volume of the simulation was comparable to the volume of the data that we were looking at. Out of the 8 million or so total clumps in our simulation we found our set of 20,000 that looked like possibilities to compare to the Milky Way. By looking for fainter things around them — and remember there are a lot more faint things than bright ones — we were looking for many, many possibilities at one time.”
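The snapshot-to-snapshot tracing can be sketched in a few lines: match each halo in an earlier snapshot to the later halo that inherits the largest share of its particle IDs. The data structures and the majority-vote matching rule here are illustrative assumptions, not the team’s actual merger-tree code.

```python
# Sketch of linking halos between snapshots by shared particle IDs.
from collections import Counter

def link_halos(halos_early, halos_late):
    """Each argument maps halo id -> set of particle ids.
    Returns {early_halo_id: best-matching late_halo_id or None}."""
    # Invert the later snapshot: particle id -> halo id.
    particle_to_halo = {pid: hid for hid, pids in halos_late.items()
                        for pid in pids}
    links = {}
    for hid, pids in halos_early.items():
        # Count which later halo received each of this halo's particles.
        votes = Counter(particle_to_halo[p] for p in pids
                        if p in particle_to_halo)
        links[hid] = votes.most_common(1)[0][0] if votes else None
    return links

# Two toy snapshots: halo 1 flows into halo 10, halo 2 into halo 11.
early = {1: {0, 1, 2, 3}, 2: {4, 5, 6}}
late = {10: {0, 1, 2, 3, 4}, 11: {5, 6, 7}}
print(link_halos(early, late))  # {1: 10, 2: 11}
```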

The computational challenges are abundant in a project like this, Wechsler said. Of all the bottlenecks, storage has been the most persistent, although she noted that for now there are no real solutions to these problems.

Aside from the bottlenecks caused by massive storage requirements, Wechsler said the other computational challenge is raw capability: even though this project represented one of the highest resolution simulations at such a volume, the team needs still more power. Although larger simulations can be run at lower resolution, capturing the full dynamic range of the calculation is critical. This simulation breaks new ground in simulating Magellanic cloud-size objects over a large volume, but that volume is still smaller than the one the observations can probe. Scaling this kind of calculation up to the next level is therefore a major challenge, especially as Wechsler embarks on new projects.

“Our data challenges are the same as those in many other fields that are tackling multiscale problems. We have a wide dynamic range of statistics to deal with but what did enable us to do this simulation is being able to resolve many small objects in a large volume. For this and other research projects, having a wide dynamic range of scales is crucial so some of our lessons can certainly be carried over to other fields.”

As Alex Szalay from the Johns Hopkins University Department of Physics and Astronomy noted, this is a prime example of the kinds of big data problems that researchers in astrophysics and other fields are facing. They are, as he told us, “forced to make tradeoffs when they enter the extreme scale” and need to find ways to manage both storage and CPU resources so that these tradeoffs have the least possible impact on the overall time to solution. Dr. Szalay addressed some of the specific challenges involved in Wechsler’s project in a recent presentation called “Extreme Database-Centric Scientific Computing,” in which he examines the new scalable architectures required for data-intensive scientific applications, treating the database as the starting point for exploring new solutions.

For the Dark Energy Survey, the team will take images of about one-eighth of the sky, reaching back seven billion years. The Large Synoptic Survey Telescope, which is currently under construction, will image half the sky every three days and will reach even fainter objects, detecting the brightest stars back to a few billion years after the big bang. One goal is to map where everything is in order to figure out what the universe is made of. Galaxy surveys help with this research because simulations can connect the underlying physics to the structures the surveys map, illuminating galactic evolution.
