OSC Helps Researchers Unveil Most Accurate Map of the Invisible Universe

August 10, 2017

COLUMBUS, Ohio, Aug. 10, 2017 — The Ohio Supercomputer Center played a critical role in helping researchers reach a milestone: mapping the growth of the universe from its infancy to the present day.

The new results, released Aug. 3, confirm the surprisingly simple but puzzling theory that the present universe is composed of only 4 percent ordinary matter, 26 percent mysterious dark matter, and 70 percent dark energy, which causes the accelerating expansion of the universe.

The findings from researchers at The Ohio State University and their colleagues from the Dark Energy Survey (DES) collaboration are based on data collected during the first year of the DES, which covers more than 1,300 square degrees of the sky, or about the area of 6,000 full moons. DES uses the Dark Energy Camera mounted on the Blanco 4-meter telescope at the Cerro Tololo Inter-American Observatory high in the Chilean Andes.

According to Klaus Honscheid, Ph.D., professor of physics and leader of the Ohio State DES group, OSC was critical to getting the research done in a timely manner. His computational specialists, postdoctoral fellows Michael Troxel and Niall MacCrann, used an estimated 300,000 core hours on OSC’s Ruby Cluster through a condo arrangement between OSC and Ohio State’s Center for Cosmology and Astro-Particle Physics (CCAPP).

The team took advantage of OSC’s Anaconda environment, an open-source distribution of the Python and R programming languages for large-scale data processing, predictive analytics and scientific computing. The group then used its own software to explore the multi-dimensional parameter space with Markov Chain Monte Carlo (MCMC) techniques, which generate fair samples from a probability distribution. The team also ran validation code, or null tests, for object selection, along with fitting code that extracts information about objects in the images by simultaneously fitting the same object in all available exposures.
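The MCMC idea is simple to sketch. The collaboration’s actual sampling software is not shown in this article; the following is a minimal Metropolis-Hastings sampler in Python, with a toy two-parameter Gaussian posterior standing in for a real cosmological likelihood (the function names, step sizes and fiducial values here are illustrative assumptions, not DES code):

```python
import numpy as np

def log_posterior(theta):
    # Hypothetical stand-in for a cosmological likelihood: a simple
    # Gaussian centered on made-up fiducial parameters (e.g., Omega_m, sigma_8).
    fiducial = np.array([0.3, 0.8])
    cov_inv = np.linalg.inv(np.diag([0.02**2, 0.03**2]))
    d = theta - fiducial
    return -0.5 * d @ cov_inv @ d

def metropolis_hastings(log_post, start, n_steps=50_000, step=0.01, seed=0):
    """Draw fair samples from exp(log_post) using a random-walk proposal."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(start, dtype=float)
    lp = log_post(theta)
    chain = np.empty((n_steps, theta.size))
    for i in range(n_steps):
        proposal = theta + rng.normal(0.0, step, size=theta.shape)
        lp_new = log_post(proposal)
        # Accept the move with probability min(1, posterior ratio).
        if np.log(rng.random()) < lp_new - lp:
            theta, lp = proposal, lp_new
        chain[i] = theta
    return chain

chain = metropolis_hastings(log_posterior, start=[0.25, 0.75])
print(chain[10_000:].mean(axis=0))  # posterior means after discarding burn-in
```

After enough steps, the chain’s samples follow the posterior distribution, so parameter estimates and their uncertainties fall out of simple averages and spreads over the chain.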

The bulk of the team’s computing allocation, roughly 4 million core hours, is at the National Energy Research Scientific Computing Center (NERSC), a federal supercomputing facility in California. However, due to a backlog at NERSC, OSC’s role became key.

According to Honscheid, the team is considering shifting more of the work to OSC for the next round of analysis. The full survey will run for five years, he said, meaning the need for high-performance computing will only grow.

To collect the data, the team built an incredibly powerful camera for the Blanco 4-meter telescope.

“We had to construct the most powerful instrument of its kind. It is sensitive enough to collect light from galaxies 8 billion light years away,” said Honscheid.

Key components of the 570-megapixel camera were built at Ohio State.

Paradoxically, it is easier to measure the structure of the universe in the distant past than it is to measure it today. In the first 400,000 years after the Big Bang, the universe was filled with a glowing gas, the light from which survives to this day. This cosmic microwave background (CMB) radiation provides a snapshot of the universe at that early time. Since then, the gravity of dark matter has pulled mass together and made the universe clumpier. But dark energy has been fighting back, pushing matter apart. Using the CMB as a start, cosmologists can calculate precisely how this battle plays out over 14 billion years.

“With the new results, we are able for the first time to see the current structure of the universe with a similar level of clarity as we can see its infancy. Dark energy is needed to explain how the infant universe evolved to what we observe now,” said MacCrann, a major contributor to the analysis.

DES scientists used two methods to measure dark matter. First, they created maps of galaxy positions as tracers; second, they precisely measured the shapes of 26 million galaxies to directly map the patterns of dark matter over billions of light years, using a technique called gravitational lensing. Ashley Ross of CCAPP, leader of the DES large-scale structure working group, said, “For the first time we were able to perform these studies with data from the same experiment, allowing us to obtain the most accurate results to date.”
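Gravitational lensing works because the gravity of intervening, mostly dark, matter slightly distorts the apparent shapes of background galaxies; individual intrinsic shapes are random, so averaging many galaxies in a patch of sky makes the coherent lensing distortion, or shear, stand out. Here is a toy Python illustration of that averaging step (the shear field, shape-noise level and binning are invented for demonstration; the real DES pipeline is far more sophisticated):

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy catalog: sky positions (degrees) and observed ellipticity components.
# Observed shape ~ random intrinsic shape + a small gravitational shear.
n_gal = 200_000
ra = rng.uniform(0.0, 10.0, n_gal)
dec = rng.uniform(0.0, 10.0, n_gal)
true_g1 = 0.01 * np.sin(2 * np.pi * ra / 10.0)   # invented shear field
e1 = rng.normal(0.0, 0.26, n_gal) + true_g1      # 0.26 ~ typical shape noise

# Bin the sky into pixels and average e1 per pixel: the random intrinsic
# shapes cancel, and the mean traces the shear, hence the projected matter.
n_pix = 20
sum_e1, xedges, yedges = np.histogram2d(ra, dec, bins=n_pix, weights=e1)
counts, _, _ = np.histogram2d(ra, dec, bins=n_pix)
g1_map = sum_e1 / np.maximum(counts, 1)

print("recovered shear along RA:", np.round(g1_map[:, 0], 4))
```

With roughly 500 galaxies per pixel in this toy, the shape noise averages down to about 0.26/sqrt(500) ≈ 0.012, comparable to the percent-level shear signal itself, which is why surveys need tens of millions of galaxies to map dark matter precisely.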

To make these ultra-precise measurements, the DES team developed new ways to detect the tiny lensing distortions of galaxy images, an effect not even visible to the eye, enabling revolutionary advances in understanding these cosmic signals. In the process, they created the largest guide to spotting dark matter in the cosmos ever drawn. The new dark matter map is 10 times the size of the one DES released in 2015 and will eventually be three times larger than it is now.

A large scientific team achieved these results working in seven countries across three continents.

“Successful collaboration at this scale represents many years of deep commitment, collective vision, and sustained effort,” said Ami Choi, CCAPP postdoctoral fellow who worked on the galaxy shape measurements.

Michael Troxel, CCAPP postdoctoral fellow and leader of the weak gravitational lensing analysis, added, “These results are based on unprecedented statistical power and detailed understanding of the telescope and potential biases in the analysis. Crucially, we performed a ‘blind’ analysis, in which we finalized all aspects of the analysis before we knew the results, thereby avoiding confirmation biases.”
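The article does not detail the blinding scheme, but a common approach in cosmology, a form of which DES used, is to scale the measurements by a secret random factor known only to a designated keeper, so analysts must freeze every cut and fit before seeing the true values. A minimal sketch of the idea (the seed handling, blinding range and numbers below are hypothetical):

```python
import numpy as np

def make_blinding_factor(seed):
    # The seed is held by a designated keeper; analysts never see it.
    rng = np.random.default_rng(seed)
    return rng.uniform(0.9, 1.1)   # hypothetical +/-10 percent blinding range

def blind(values, factor):
    return np.asarray(values) * factor

secret = make_blinding_factor(seed=20170803)     # kept out of analysts' hands
blinded = blind([0.021, 0.019, 0.023], secret)   # what analysts work with

# Only after all cuts, fits and null tests are frozen is the factor
# revealed and divided back out to "unblind" the final result.
unblinded = blinded / secret
```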

The DES measurements of the present universe agree with the results obtained by the Planck satellite that studied the cosmic microwave background radiation from a time when the universe was just 400,000 years old.

“The moment we realized that our measurement matched the Planck result within 7% was thrilling for the entire collaboration,” said Honscheid. “And this is just the beginning for DES with more data already observed. With one more observing season to go, we expect to ultimately use five times more data to learn more about the enigmatic dark sector of the universe.”

The new results from the Dark Energy Survey will be presented by Kavli fellow Elisabeth Krause at the TeV Particle Astrophysics Conference in Columbus on Aug. 9, and by CCAPP’s Troxel at the International Symposium on Lepton Photon Interactions at High Energies in Guangzhou, China, on Aug. 10.

The publications can be accessed on the Dark Energy Survey website.

Ohio State University is an institutional member of the Dark Energy Survey collaboration. Funding for this research comes in part from Ohio State’s Center for Cosmology and Astro-Particle Physics. The Ohio Supercomputer Center provided a portion of the computing power for this project.

The Ohio State DES team includes Honscheid; Paul Martini and David Weinberg, both professors of astronomy; Choi, Ross, MacCrann, and Troxel, all postdoctoral fellows at CCAPP; and doctoral students Su-Jeong Lee and Hui Kong.


Source: OSC
