OSC Helps Researchers Unveil Most Accurate Map of the Invisible Universe

August 10, 2017

COLUMBUS, Ohio, Aug. 10, 2017 — The Ohio Supercomputer Center played a critical role in helping researchers reach a milestone mapping the growth of the universe from its infancy to present day.

The new results released Aug. 3 confirm the surprisingly simple but puzzling theory that the present universe is composed of only 4 percent ordinary matter, 26 percent dark matter, and the remaining 70 percent dark energy, the mysterious component that drives the accelerating expansion of the universe.

The findings from researchers at The Ohio State University and their colleagues from the Dark Energy Survey (DES) collaboration are based on data collected during the first year of the DES, which covers more than 1,300 square degrees of the sky, or about the area of 6,000 full moons. DES uses the Dark Energy Camera mounted on the Blanco 4m telescope at the Cerro Tololo Inter-American Observatory high in the Chilean Andes.

According to Klaus Honscheid, Ph.D., professor of physics and leader of the Ohio State DES group, OSC was critical to getting the research done in a timely manner. His computational specialists – Michael Troxel and Niall MacCrann, postdoctoral fellows – used an estimated 300,000 core hours on OSC’s Ruby Cluster through a condo arrangement between OSC and Ohio State’s Center for Cosmology and Astro-Particle Physics (CCAPP).

The team used OSC’s Anaconda environment for standard work; Anaconda is an open-source distribution of the Python and R programming languages for large-scale data processing, predictive analytics and scientific computing. The group then used its own software to explore the multi-dimensional parameter space with Markov Chain Monte Carlo techniques, which generate fair samples from a probability distribution. The team also ran validation code, or null tests, for object selection, along with fitting code that extracts information about objects in the images by simultaneously fitting the same object in all available exposures.
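The core idea behind Markov Chain Monte Carlo is straightforward: propose a random step in parameter space and accept it with a probability that depends on how the posterior density changes, so that the chain's visits trace out the target distribution. A minimal Metropolis-Hastings sketch in Python illustrates the technique; this is not the DES pipeline, and the function names and the Gaussian target are illustrative stand-ins for a real cosmological posterior.

```python
import numpy as np

def metropolis_hastings(log_prob, x0, n_steps=5000, step=0.5, seed=0):
    """Draw samples from a density given only its log-probability.

    Minimal Metropolis-Hastings sampler with a Gaussian proposal;
    illustrative only, not the DES analysis code.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    lp = log_prob(x)
    chain = []
    for _ in range(n_steps):
        proposal = x + step * rng.standard_normal(x.shape)
        lp_new = log_prob(proposal)
        # Accept with probability min(1, p(new)/p(old)), done in log space
        if np.log(rng.random()) < lp_new - lp:
            x, lp = proposal, lp_new
        chain.append(x.copy())
    return np.array(chain)

# Example target: a 2-D standard Gaussian, standing in for a
# multi-dimensional cosmological parameter space.
log_gauss = lambda x: -0.5 * np.sum(x**2)
samples = metropolis_hastings(log_gauss, x0=[3.0, -3.0])
print(samples.mean(axis=0))  # should drift toward [0, 0] after burn-in
```

In practice, cosmology analyses use far more sophisticated samplers and likelihoods, but the accept/reject loop above is the essential mechanism that produces "fair samples" from the posterior.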

The bulk of the team’s computational allocation, some 4 million core hours, is at the National Energy Research Scientific Computing Center (NERSC), a federal supercomputing facility in California. However, due to a backlog at NERSC, OSC’s role became key.

According to Honscheid, for the next analysis round the team is considering increasing the amount of work done through OSC. The total survey will last five years, he said, meaning the need for high performance computing will only increase.

In order to collect the data, the team built an incredibly powerful camera for the Blanco 4m telescope.

“We had to construct the most powerful instrument of its kind. It is sensitive enough to collect light from galaxies 8 billion light years away,” said Honscheid.

Key components of the 570 mega-pixel camera were built at Ohio State.

Paradoxically, it is easier to measure the structure of the universe in the distant past than it is to measure it today. In the first 400,000 years after the Big Bang, the universe was filled with a glowing gas, the light from which survives to this day. This cosmic microwave background (CMB) radiation provides a snapshot of the universe at that early time. Since then, the gravity of dark matter has pulled mass together and made the universe clumpier. But dark energy has been fighting back, pushing matter apart. Using the CMB as a start, cosmologists can calculate precisely how this battle plays out over 14 billion years.

“With the new results, we are able for the first time to see the current structure of the universe with a similar level of clarity as we can see its infancy. Dark energy is needed to explain how the infant universe evolved to what we observe now,” said MacCrann, a major contributor to the analysis.

DES scientists used two methods to measure dark matter. First, they created maps of galaxy positions as tracers; second, they precisely measured the shapes of 26 million galaxies to directly map the patterns of dark matter over billions of light years, using a technique called gravitational lensing. Ashley Ross of CCAPP, leader of the DES large-scale structure working group, said, “For the first time we were able to perform these studies with data from the same experiment, allowing us to obtain the most accurate results to date.”

To make these ultra-precise measurements, the DES team developed new ways to detect the tiny lensing distortions of galaxy images, an effect not even visible to the eye, enabling revolutionary advances in understanding these cosmic signals. In the process, they created the largest guide to spotting dark matter in the cosmos ever drawn (see image). The new dark matter map is 10 times the size of the one DES released in 2015 and will eventually be three times larger than it is now.

A large scientific team achieved these results working in seven countries across three continents.

“Successful collaboration at this scale represents many years of deep commitment, collective vision, and sustained effort,” said Ami Choi, CCAPP postdoctoral fellow who worked on the galaxy shape measurements.

Michael Troxel, CCAPP postdoctoral fellow and leader of the weak gravitational lensing analysis, added, “These results are based on unprecedented statistical power and detailed understanding of the telescope and potential biases in the analysis. Crucially, we performed a ‘blind’ analysis, in which we finalized all aspects of the analysis before we knew the results, thereby avoiding confirmation biases.”

The DES measurements of the present universe agree with the results obtained by the Planck satellite that studied the cosmic microwave background radiation from a time when the universe was just 400,000 years old.

“The moment we realized that our measurement matched the Planck result within 7% was thrilling for the entire collaboration,” said Honscheid. “And this is just the beginning for DES with more data already observed. With one more observing season to go, we expect to ultimately use five times more data to learn more about the enigmatic dark sector of the universe.”

The new results from the Dark Energy Survey will be presented by Kavli fellow Elisabeth Krause at the TeV Particle Astrophysics Conference in Columbus on Aug. 9, and by CCAPP’s Troxel at the International Symposium on Lepton Photon Interactions at High Energies in Guangzhou, China, on Aug. 10.

The publications can be accessed on the Dark Energy Survey website.

Ohio State University is an institutional member of the Dark Energy Survey collaboration. Funding for this research comes in part from Ohio State’s Center for Cosmology and Astro-Particle Physics. The Ohio Supercomputer Center provided a portion of the computing power for this project.

The Ohio State DES team includes Honscheid; Paul Martini and David Weinberg, both professors of astronomy; Choi, Ross, MacCrann, and Troxel, all postdoctoral fellows at CCAPP; and doctoral students Su-Jeong Lee and Hui Kong.


Source: OSC
