Dark Energy Survey Releases Most Precise Look at the Universe’s Evolution

June 7, 2021

New results from the Dark Energy Survey use the largest-ever sample of galaxies over an enormous swath of the sky to produce the most precise measurements to date of the universe's composition and growth. Scientists found that the way matter is distributed throughout the universe is consistent with the predictions of the standard cosmological model, the best current model of the universe.

Over the course of six years, DES surveyed 5,000 square degrees—almost one-eighth of the entire sky—in 758 nights of observation, cataloguing hundreds of millions of objects. The results announced today draw on data from the first three years—226 million galaxies observed over 345 nights—to create the largest and most precise maps yet of the distribution of galaxies in the universe at relatively recent epochs.

Since DES studied nearby galaxies as well as those billions of light-years away, its maps provide both a snapshot of the current large-scale structure of the universe and a movie of how that structure has evolved over the course of the past 7 billion years.

To test cosmologists' current model of the universe, DES scientists compared their results with measurements from the European Space Agency's orbiting Planck observatory. Planck used the relic light known as the cosmic microwave background to peer back to the early universe, just 400,000 years after the Big Bang. The Planck data give a precise view of the universe 13 billion years ago, and the standard cosmological model predicts how the dark matter should evolve from then to the present. If DES's observations were to deviate from that prediction, the mismatch could point to an undiscovered aspect of the universe. DES and several previous galaxy surveys have seen persistent hints that the current universe is a few percent less clumpy than predicted, an intriguing finding worthy of further investigation; even so, the newly released results remain consistent with the prediction.

“In the area of constraining what we know about the distribution and structure of matter on large scales as driven by dark matter and dark energy, DES has obtained limits that rival and complement those from the cosmic microwave background,” said Brian Yanny, a Fermilab scientist who coordinated DES data processing and management. “It’s exciting to have precise measurements of what’s out there and a better understanding of how the universe has changed from its infancy through to today.”

Ordinary matter makes up only about 5% of the universe. Dark energy, which cosmologists hypothesize drives the accelerating expansion of the universe by counteracting the force of gravity, accounts for about 70%. The remaining 25% is dark matter, whose gravitational influence binds galaxies together. Both dark matter and dark energy remain invisible and mysterious, but DES seeks to illuminate their natures by studying how the competition between them shapes the large-scale structure of the universe over cosmic time.

Ten areas in the sky were selected as “deep fields” that the Dark Energy Camera imaged multiple times during the survey, providing a glimpse of distant galaxies and helping determine their 3-D distribution in the cosmos. Photo: Dark Energy Survey

DES photographed the night sky using the 570-megapixel Dark Energy Camera on the Victor M. Blanco 4-meter Telescope at the Cerro Tololo Inter-American Observatory in Chile, a Program of the National Science Foundation’s NOIRLab. One of the most powerful digital cameras in the world, the Dark Energy Camera was designed specifically for DES and built and tested at Fermilab. The DES data were processed at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign.

“These analyses are truly state-of-the-art, requiring artificial intelligence and high-performance computing super-charged by the smartest young scientists around,” said Scott Dodelson, a physicist at Carnegie Mellon University who co-leads the DES Science Committee with Elisabeth Krause of the University of Arizona. “What an honor to be part of this team.”

To quantify the distribution of dark matter and the effect of dark energy, DES relied on two main phenomena. First, on large scales, galaxies are not distributed randomly throughout space but rather form a weblike structure due to the gravity of dark matter. DES measured how this cosmic web has evolved over the history of the universe. The galaxy clustering that forms the cosmic web, in turn, revealed regions with a higher density of dark matter.
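
To make the idea concrete, here is a minimal sketch, in Python, of the standard statistic behind galaxy clustering: the two-point correlation function, which measures how much more often galaxy pairs occur at a given separation than they would in a purely random distribution. The catalog below is toy data, and the simple DD/RR estimator is an illustration, not the estimator used in the actual DES pipeline.

```python
# Minimal sketch of a two-point correlation function, xi(r), using the
# simple Peebles-Hauser estimator xi = DD/RR - 1 on toy 3-D positions.
# A clustered catalog would give xi > 0 at small separations; the
# uniform toy catalog here gives xi ~ 0 everywhere.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(42)
box = 100.0                                      # toy box size, arbitrary units
galaxies = rng.uniform(0, box, size=(5_000, 3))  # stand-in galaxy positions
randoms = rng.uniform(0, box, size=(50_000, 3))  # random comparison catalog

edges = np.linspace(1.0, 20.0, 11)               # separation bin edges

def pair_counts(points, edges):
    """Count pairs of points falling in each separation bin."""
    tree = cKDTree(points)
    cumulative = tree.count_neighbors(tree, edges)  # pairs with distance <= r
    return np.diff(cumulative).astype(float)

dd = pair_counts(galaxies, edges)
rr = pair_counts(randoms, edges)

# Normalize each count by the number of possible pairs in its catalog,
# then form the estimator.
n_g, n_r = len(galaxies), len(randoms)
xi = (dd / (n_g * (n_g - 1))) / (rr / (n_r * (n_r - 1))) - 1.0
print(np.round(xi, 3))
```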

Second, DES detected the signature of dark matter through weak gravitational lensing. As light from a distant galaxy travels through space, the gravity of both ordinary and dark matter can bend it, resulting in a distorted image of the galaxy as seen from Earth. By studying how the apparent shapes of distant galaxies are aligned with each other and with the positions of nearby galaxies along the line of sight, DES scientists inferred the spatial distribution (or clumpiness) of the dark matter in the universe.
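
The heart of that measurement can be sketched in a few lines: rotate each background galaxy's ellipticity into the frame tangential to a foreground lens and average, so that the random intrinsic shapes cancel while the coherent gravitational stretch remains. The data below are simulated with an injected shear under one common sign convention; the sketch illustrates the principle, not the DES shape-measurement pipeline.

```python
# Toy galaxy-galaxy lensing measurement: recover the mean tangential
# shear of background galaxies around a single foreground lens.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Background galaxy positions (radians) relative to a lens at the origin.
x = rng.normal(scale=0.01, size=n)
y = rng.normal(scale=0.01, size=n)
phi = np.arctan2(y, x)              # position angle around the lens

# Observed ellipticity = random intrinsic shape + a small injected
# tangential shear (toy amplitude).
gamma_t = 0.01
e1 = rng.normal(scale=0.3, size=n) - gamma_t * np.cos(2 * phi)
e2 = rng.normal(scale=0.3, size=n) - gamma_t * np.sin(2 * phi)

# Rotate each ellipticity into the tangential frame and average; the
# shape noise averages away, leaving the lensing signal.
e_t = -(e1 * np.cos(2 * phi) + e2 * np.sin(2 * phi))
print(f"recovered tangential shear: {e_t.mean():.4f}")  # close to 0.01
```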

Analyzing the massive amounts of data collected by DES was a formidable undertaking. The team began by analyzing just the first year of data, which was released in 2017. That process prepared the researchers to use more sophisticated techniques for analyzing the larger data set, which includes the largest sample of galaxies ever used to study weak gravitational lensing.

For example, calculating the redshift of a galaxy—the change in light’s wavelength due to the expansion of the universe—is a key step toward measuring how both galaxy clustering and weak gravitational lensing change over cosmic history. The redshift of a galaxy is related to its distance, which allows the clustering to be characterized in both space and time.
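
As a concrete illustration of that chain, from wavelength shift to redshift to distance, here is a minimal sketch using astropy's built-in Planck 2018 cosmology; the observed wavelength is a hypothetical value chosen to give a redshift near 0.5.

```python
# Redshift from the observed shift of a known spectral line, then
# distance and lookback time from the redshift under a fiducial
# cosmology (astropy's Planck 2018 parameters).
from astropy.cosmology import Planck18

lambda_rest = 656.28   # H-alpha rest wavelength, nm
lambda_obs = 985.0     # hypothetical observed wavelength, nm

z = (lambda_obs - lambda_rest) / lambda_rest  # redshift definition
d = Planck18.comoving_distance(z)             # distance today
t = Planck18.lookback_time(z)                 # how far back we are seeing

print(f"z = {z:.3f}")                   # ~0.50
print(f"comoving distance = {d:.0f}")   # ~1900 Mpc
print(f"lookback time = {t:.2f}")       # ~5 Gyr
```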

“There was significant improvement in how to calibrate the redshift distributions of the galaxy samples,” said Judit Prat, a postdoc at the University of Chicago who analyzed weak gravitational lensing as captured by DES. “This was a huge effort that people put a lot of work into. We now have a method that nobody has used before, and it’s very robust.”

Ten regions of the sky were chosen as “deep fields” that the Dark Energy Camera imaged repeatedly throughout the survey. Stacking those images together allowed the scientists to glimpse more distant galaxies. The team then used the redshift information from the deep fields to calibrate measurements of redshift in the rest of the survey region. This and other advancements in measurements and modeling, coupled with a threefold increase in data compared to the first year, enabled the team to pin down the density and clumpiness of the universe with unprecedented precision.
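
The payoff of stacking is easy to demonstrate numerically: averaging N exposures of the same field leaves the sky signal unchanged while shrinking the random noise by a factor of sqrt(N). The sketch below uses a toy image with a source sitting well below the single-exposure noise floor.

```python
# Toy demonstration of image stacking (coaddition): a faint source
# invisible in one exposure becomes detectable in the average of 100.
import numpy as np

rng = np.random.default_rng(1)
n_exposures = 100
signal = np.zeros((64, 64))
signal[32, 32] = 0.5   # a faint source, half the single-exposure noise

# Each exposure = the same signal + independent unit-variance noise.
exposures = signal + rng.normal(size=(n_exposures, 64, 64))

single = exposures[0]
stacked = exposures.mean(axis=0)  # noise sigma drops to 1/sqrt(100) = 0.1

print(f"source pixel, one exposure: {single[32, 32]:+.2f} (noise sigma 1.0)")
print(f"source pixel, stacked:      {stacked[32, 32]:+.2f} (noise sigma 0.1)")
```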

Along with the analysis of the weak-lensing signals, DES also precisely measures other probes that constrain the cosmological model in independent ways: galaxy clustering on larger scales (baryon acoustic oscillations), the frequency of massive clusters of galaxies, and high-precision measurements of the brightnesses and redshifts of Type Ia supernovae. These additional measurements will be combined with the current weak-lensing analysis to yield even more stringent constraints on the standard model.
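
The supernova probe, for instance, rests on a simple comparison: at a fixed redshift, a universe dominated by dark energy places a Type Ia supernova farther away, and therefore makes it appear fainter, than a matter-only universe would. Here is a minimal sketch with astropy; the parameter values are illustrative, not DES fits.

```python
# Compare the distance modulus mu = 5*log10(d_L / 10 pc) of a Type Ia
# supernova in two model universes: one with dark energy, one without.
from astropy.cosmology import FlatLambdaCDM

with_de = FlatLambdaCDM(H0=70, Om0=0.3)      # ~70% dark energy
matter_only = FlatLambdaCDM(H0=70, Om0=1.0)  # no dark energy

for z in (0.1, 0.5, 1.0):
    mu_de = with_de.distmod(z).value
    mu_mo = matter_only.distmod(z).value
    # Positive difference: the supernova looks fainter with dark energy.
    print(f"z={z}: mu = {mu_de:.2f} vs {mu_mo:.2f} "
          f"(diff {mu_de - mu_mo:+.2f} mag)")
```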

“DES has delivered cost-effective, leading-edge science results directly related to Fermilab’s mission of pursuing the fundamental nature of matter, energy, space and time,” said Fermilab Director Nigel Lockyer. “A dedicated team of scientists, engineers and technicians from institutions around the world brought DES to fruition.”

The DES collaboration consists of over 400 scientists from 25 institutions in seven countries.

“The collaboration is remarkably young. It’s tilted strongly in the direction of postdocs and graduate students who are doing a huge amount of this work,” said DES Director and spokesperson Rich Kron, who is a Fermilab and University of Chicago scientist. “That’s really gratifying. A new generation of cosmologists are being trained using the Dark Energy Survey.”

DES concluded observations of the night sky in 2019. With the experience of analyzing the first half of the data, the team is now prepared to handle the complete data set. The final DES analysis is expected to paint an even more precise picture of the dark matter and dark energy in the universe. And the methods developed by the team have paved the way for future sky surveys to probe the mysteries of the cosmos.

“The real legacy of DES will be the leaps forward we’ve had to make that were essential for this key result, and which will be critical for the next generation of cosmological experiments starting soon,” said Michael Troxel, a physicist at Duke University and the key project coordinator for the DES three-year data analysis. Upcoming experiments include both space-based imaging experiments and ground-based surveys such as the Vera C. Rubin Observatory Legacy Survey of Space and Time.

“With these instruments we’ve built to stare into the dark, we are working to solve universal mysteries,” said Troxel.

The recent DES results were presented in a scientific seminar on May 27. Twenty-nine papers describing the analysis are available on the arXiv online repository.

ABOUT THE DARK ENERGY SURVEY

The Dark Energy Survey is a collaboration of more than 400 scientists from 25 institutions in seven countries.

Funding for the DES Projects has been provided by the U.S. Department of Energy, the U.S. National Science Foundation, the Ministry of Science and Education of Spain, the Science and Technology Facilities Council of the United Kingdom, the Higher Education Funding Council for England, the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign, the Kavli Institute of Cosmological Physics at the University of Chicago, the Funding Authority for Studies and Projects in Brazil, the Carlos Chagas Filho Foundation for Research Support of the State of Rio de Janeiro, the Brazilian National Council for Scientific and Technological Development and the Ministry of Science and Technology, the German Research Foundation and the collaborating institutions in the Dark Energy Survey.

ABOUT THE CERRO TOLOLO INTER-AMERICAN OBSERVATORY

The Cerro Tololo Inter-American Observatory is a Program of NSF’s NOIRLab, which is operated by the Association of Universities for Research in Astronomy (AURA) under a cooperative agreement with the National Science Foundation. NSF is an independent federal agency created by Congress in 1950 to promote the progress of science. NSF supports basic research and people to create knowledge that transforms the future.

ABOUT NCSA

The National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign provides supercomputing and advanced digital resources for the nation's science enterprise. At NCSA, University of Illinois faculty, staff, students and collaborators from around the globe use these resources to address research challenges for the benefit of science and society. For more than 35 years, NCSA has worked with many of the world's industry giants, bringing industry, researchers and students together to solve grand challenges at rapid speed and scale.

ABOUT FERMILAB

Fermilab is America’s premier national laboratory for particle physics and accelerator research. A U.S. Department of Energy Office of Science laboratory, Fermilab is located near Chicago, Illinois, and operated under contract by the Fermi Research Alliance LLC. Visit Fermilab’s website at www.fnal.gov and follow us on Twitter at @Fermilab.

ABOUT THE DOE OFFICE OF SCIENCE

The DOE Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.



Source: NCSA
