The National Science Foundation (NSF) made another big investment* this week with the launch of the Institute for Research and Innovation in Software for High-Energy Physics (IRIS-HEP), created to help ensure that physicists have the tools needed to study and analyze the enormous volumes of data that will come off the High-Luminosity Large Hadron Collider (HL-LHC) at CERN in Geneva, Switzerland. Backed by $25 million in NSF funding spread over five years, the coalition will create next-generation cyberinfrastructure to address the unprecedented data generated by the upgraded Large Hadron Collider, advancing scientific understanding of particles such as the Higgs boson, first observed in 2012.
Led by Princeton University, IRIS-HEP includes multidisciplinary teams spanning 17 universities. The effort’s primary focus will be developing innovative software and training the next generation of users.
By the time the HL-LHC reaches full capability in 2026, it will produce more than 1 billion particle collisions every second, and only a tiny fraction of these overlapping collisions will be interesting candidates for further study. With ten times the luminosity of its predecessor, the fully upgraded collider will drive a correspondingly large increase in data processing and storage requirements. CERN and collaborating scientists are developing the tools needed to capture, sort, and record only the most relevant events, managing this needle-in-a-haystack problem.
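As a rough illustration of that filtering problem, the Python sketch below keeps only toy events passing a single energy cut. The event format, the et_gev feature, and the threshold are all hypothetical; real LHC triggers apply far more sophisticated, multi-stage hardware and software selections.

```python
import random

# Hypothetical selection threshold in GeV; the feature name, event
# format, and cut value here are invented for illustration only.
ENERGY_THRESHOLD = 200.0

def simulated_events(n):
    """Yield toy 'events', each with a random transverse energy (GeV)."""
    for i in range(n):
        yield {"id": i, "et_gev": random.expovariate(1 / 20.0)}

def trigger(events, threshold=ENERGY_THRESHOLD):
    """Keep only events passing the cut; everything else is discarded."""
    for event in events:
        if event["et_gev"] > threshold:
            yield event

# Out of a million toy events, only a few dozen survive the cut:
# a vastly simplified analogue of selecting rare collisions of interest.
kept = list(trigger(simulated_events(1_000_000)))
print(f"kept {len(kept)} of 1,000,000 events")
```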
“High-energy physics had a rush of discoveries and advancements in the 1960s and 1970s that led to the Standard Model of particle physics, and the Higgs boson was the last missing piece of that puzzle,” said Princeton University computational physicist Peter Elmer, the principal investigator for the institute and a CERN researcher. “We are now searching for the next layer of physics beyond the Standard Model. The software institute will be key to getting us there. Primarily about people, rather than computing hardware, it will be an intellectual hub for community-wide software research and development, bringing researchers together to develop the powerful new software tools, algorithms and system designs that will allow us to explore high-luminosity LHC data and make discoveries.”
“[T]o fully explore this data, we need much more powerful software tools and algorithms,” said Elmer. “We also need to maximally exploit the evolving high-performance computing landscape and new tools like machine learning, in which computers study existing data sets to learn rules that they can apply to new data and new situations.”
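As a toy illustration of the machine-learning workflow Elmer describes, the sketch below trains a scikit-learn classifier on synthetic labeled "events" and then applies the learned rules to unseen ones. The features, the dataset, and the choice of classifier are assumptions made purely for this example, not IRIS-HEP tooling.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic 'background' and 'signal' events, each described by two
# invented features (think: an invariant mass and a missing energy).
background = rng.normal(loc=[100.0, 30.0], scale=[25.0, 10.0], size=(5000, 2))
signal = rng.normal(loc=[125.0, 50.0], scale=[5.0, 10.0], size=(500, 2))

X = np.vstack([background, signal])
y = np.concatenate([np.zeros(len(background)), np.ones(len(signal))])

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Learn rules from existing labeled data...
clf = GradientBoostingClassifier().fit(X_train, y_train)

# ...then apply those rules to new, unseen events.
print(f"held-out accuracy: {clf.score(X_test, y_test):.3f}")
```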
The upgrade from the Large Hadron Collider to the High-Luminosity LHC officially commenced on June 15, 2018.
Below: Peter Elmer of Princeton University explains how the Large Hadron Collider is a Higgs boson factory, and why hunting through its data is like looking for a needle in a haystack.
Sources:
https://www.nsf.gov/news/news_summ.jsp?cntn_id=296456&WT.mc_id=USNSF_51&WT.mc_ev=click
Feature image caption: A design for the Inner Tracker of the ATLAS detector, one of the hardware upgrades planned for the HL-LHC. Credit: ATLAS Experiment © 2018 CERN
*Also see:
TACC’s ‘Frontera’ Supercomputer Expands Horizon for Extreme-Scale Science
The NSF 2026 Idea Machine Seeks Good (and Big) Ideas
NSF Ramps Quantum Investment, Tabs up to $25M for Q-AMASE-i Foundry
Funding for PSC’s Bridges Supercomputer Extended by NSF
STAQ(ing) the Quantum Computing Deck