Editor’s note: The Day 1 and Day 2 reports from PEARC23 got crossed in the wires. Both reports are now posted. Thanks to Ken Chiacchia of the Pittsburgh Supercomputing Center for his great reports.
Plenary sessions on the second day of PEARC23 in Portland, Oregon, focused on applying cyberinfrastructure to the varied and complex needs of wildland firefighting, along with an update from the NSF Office of Advanced Cyberinfrastructure on its funding programs.
The annual PEARC conference series provides a forum for discussing challenges, opportunities, and solutions among the broad range of participants in the research computing community. Building on the successes of the past, the series aims to integrate and meet the collective interests of that growing community. The theme of PEARC23 is “Computing for the Common Good.”
Fighting Fires Using Data and Computing
“Convergent research” was the theme of Ilkay Altintas’s plenary presentation — integrating edge users into the scientific workflow so that, from the outset, cyberinfrastructure centers on practical applications in the field. As an example, she described efforts by the WIFIRE Lab within the Cyberinfrastructure and Convergence Research and Education Division (CICORE) at the San Diego Supercomputer Center, which she directs, to operationalize data and computation in wildland firefighting.
“When you think of convergence research, it’s built on other forms of research; so, it doesn’t happen in isolation,” Altintas said. In wildland fires, that means deep and early engagement by firefighters. “They together with us co-design the solution and become co-owners of the solution.”
The approach requires a change of focus from individual workflows to “teamflows,” she added. Wildland fires are typical among such projects in that they require a fusion of data, cyberinfrastructure, and machine learning to attack fires both immediately, as part of a firefighting response to a given blaze, and over time, as part of the planning and preventive measures needed to reduce risk long-term. The number of information sources, the complexity and varied provenance of the data, and the number and needs of different users all contribute to the scope of the problem.
The 2020 West Coast fires were devastating by any measure, Altintas noted. The fires burned some 10 million acres, including about 4% of California’s land area, and caused $16 billion in property damage, not counting the $3.5 billion cost of fighting them.
“Wildland fires are not always a problem,” she said. “Megafires are,” because the latter are of higher intensity, move more quickly, and cover more area. These events pose vastly higher risks to life and property, greater challenges in evacuating people and livestock to safety, and, of course, more difficulty in extinguishing them.
The difference, Altintas said, lies in the distinction between “regular weather” and “fire weather” — conditions that favor the growth and intensity of a fire. Fires in normal conditions burn and are fought without making more than local news. On the other hand, fire-weather conditions, fueled in California by the Santa Ana winds, account for many more acres burned and a greater, more rapidly moving threat.
Computational aids to firefighting must address distinct levels of complexity and time urgency. Firefighters trying to extinguish a wildland fire need tools that analyze the data rapidly but don’t necessarily need high resolution; ecosystem-sustainability and risk-management efforts require high detail but can tolerate lengthier computation. Other firefighting interventions may need a specific balance of speed and resolution.
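To make that tradeoff concrete, the toy sketch below runs a cellular-automaton fire-spread model at two grid resolutions over the same landscape extent. This is an illustrative exercise only, not one of WIFIRE’s actual codes: the model, its parameters, and the resolutions are assumptions chosen to show how runtime grows far faster than detail.

```python
import time
import numpy as np

def toy_fire_spread(n, steps, p_spread=0.4, seed=0):
    """Toy cellular-automaton fire spread on an n-by-n grid (illustration only)."""
    rng = np.random.default_rng(seed)
    grid = np.zeros((n, n), dtype=np.int8)  # 0 = unburned, 1 = burning, 2 = burned out
    grid[n // 2, n // 2] = 1                # ignite the center cell
    for _ in range(steps):
        burning = grid == 1
        # Any unburned cell adjacent to a burning cell may ignite this step.
        near = (np.roll(burning, 1, 0) | np.roll(burning, -1, 0) |
                np.roll(burning, 1, 1) | np.roll(burning, -1, 1))
        ignite = near & (grid == 0) & (rng.random((n, n)) < p_spread)
        grid[burning] = 2                   # burning cells burn out
        grid[ignite] = 1
    return grid

# Same landscape, two resolutions: finer grids need more cells *and* more steps.
for n in (100, 800):
    start = time.perf_counter()
    g = toy_fire_spread(n, steps=n)
    elapsed = time.perf_counter() - start
    print(f"{n}x{n} grid: {(g > 0).mean():.1%} of cells touched in {elapsed:.2f}s")
```

Even in this toy, the fine grid costs orders of magnitude more compute for the same physical area, which is why rapid-response tools accept coarser answers while planning tools do not.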
Data sources for fire interventions may include landscape/terrain data, vegetation cover and fuel, real-time fire perimeter observations, ground-based weather reports, and the results of weather forecasting and modeling. Challenges include poor data quality, provenance, and availability; a lack of standards and transparency in agencies whose mission is to fight fire, not generate data for use by other agencies; and a vast volume of not-always-relevant data that overwhelms decision-making.
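Federating such varied sources means every observation has to travel with its location, timing, and lineage, so downstream users can judge what to trust. The sketch below shows one minimal way to represent that in Python; the class and field names are hypothetical illustrations, not WIFIRE Commons’ actual schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class FireDataRecord:
    """One observation federated into a common, searchable model (hypothetical schema)."""
    source: str            # originating agency, sensor network, or model run
    kind: str              # e.g. "perimeter", "weather", "fuel", "terrain"
    observed_at: datetime  # when the measurement was taken
    geometry: dict         # GeoJSON-style point or footprint
    payload: dict          # the measurement itself
    provenance: dict = field(default_factory=dict)  # lineage and quality metadata

# A hypothetical ground-based weather report, tagged so a downstream fire model
# can check its origin and quality flags before consuming it.
rec = FireDataRecord(
    source="example-RAWS-station",
    kind="weather",
    observed_at=datetime(2020, 9, 9, 14, 0, tzinfo=timezone.utc),
    geometry={"type": "Point", "coordinates": [-121.6, 39.8]},
    payload={"wind_speed_mps": 12.1, "rel_humidity_pct": 9.0},
    provenance={"agency": "example", "quality_checked": False},
)
print(rec.kind, "from", rec.source, "at", rec.observed_at.isoformat())
```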
“We need fast solutions, and we need standardized and collaborative data infrastructure,” Altintas said.

Over the last decade, WIFIRE Lab has been building WIFIRE Commons, an infrastructure that reaches out to these data sources and federates them in a searchable, accessible data model. Other products of the center include:
- FiREMAP, a tool used by the state of California since 2016 for rapid, reactive solutions in fighting given fires
- BurnPro3D, a proactive tool for conducting prescribed burns, using controlled fires to clear fuel in a way that reduces the risk of fires growing and moving
- A deep-learning-based approach to detecting smoke plumes, still in testing
- Machine learning models built on top of QUIC-Fire, a next-generation fire behavior model developed at LANL; the models bridge the gap between 2D fire-perimeter images and a 3D model, improving resolution and accuracy in predicting fire behavior
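The QUIC-Fire coupling in the last item follows the general surrogate-modeling pattern: sample an expensive simulator offline, train a fast learned emulator on the results, and query the emulator when answers are needed quickly. The minimal sketch below illustrates only that pattern; the stand-in “simulator” function, its inputs, and the choice of regressor are assumptions for illustration, not WIFIRE’s or LANL’s actual models.

```python
# Surrogate modeling in miniature: learn a cheap emulator of a costly simulator.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)

def stand_in_simulation(x):
    """Stand-in for an expensive physics run: (wind, slope, moisture) -> spread rate."""
    wind_mps, slope_deg, fuel_moisture = x
    return wind_mps ** 1.5 * (1 + slope_deg / 45.0) * np.exp(-3.0 * fuel_moisture)

# Offline: sample the "simulator" across plausible input ranges to build training data.
X = rng.uniform(low=[0.0, 0.0, 0.02], high=[20.0, 45.0, 0.30], size=(500, 3))
y = np.array([stand_in_simulation(x) for x in X])

surrogate = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# Online: during an incident, query the fast surrogate instead of rerunning physics.
query = np.array([[12.0, 20.0, 0.08]])  # wind m/s, slope degrees, fuel moisture fraction
print("surrogate estimate:", surrogate.predict(query)[0])
print("simulator answer:  ", stand_in_simulation(query[0]))
```

The same division of labor, slow high-fidelity physics offline and fast learned models online, is what makes 3D fire-behavior prediction feasible on firefighting timescales.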
“Emerging new applications require integrated AI in dynamically composed workflows,” Altintas said. “I think as a community [cyberinfrastructure professionals] need to start embracing this complexity and inviting others … to turn these solutions to societal-scale, sustainable solutions.”
Accessible, Inclusive, and Sustainable Cyberinfrastructure Ecosystem Development through NSF Support
In the second part of the day’s plenary session, Katie Antypas, NSF’s new Director of the Office of Advanced Cyberinfrastructure, headlined a panel of OAC’s program directors who surveyed the office’s programs.
Antypas introduced the OAC’s $252 million portfolio and its priorities. Responding to national initiatives and legislation, OAC is undertaking workshops to solicit community needs and feedback; engaging user communities to identify requirements; and responding to advice from external advisory committees, National Academy of Sciences reports, and other expert input.
“The changing user communities, technologies, vendors, business models and national landscape requires us to think deeply about our collective strategy for the future,” Antypas said.
She presented the office’s priorities for community feedback: advancing and interconnecting a broad cyberinfrastructure (CI) ecosystem; developing human infrastructure in the form of workforce development; enabling scientific discovery through data; providing leadership in the planning of a National AI Research Resource; investing in and transitioning to new and innovative technologies; and developing partnerships, all with the aim of positioning the community for long-term U.S. leadership in research CI.
“[We need to] democratize the AI ecosystem to ensure that the research we do as a community is consistent with the values of the nation,” she said.
OAC’s program directors offered lightning summaries of their programs:
- Andrey Kanaev described his Advanced Computing Systems and Services (ACSS) program. ACSS is requesting proposals from organizations willing to serve as resource providers of advanced CI capabilities or services that support the full range of computational and data-intensive research across all of science and engineering. The program will fund Category 1 capacity systems up to $10 million and Category 2 prototype testbeds up to $5 million.
- Kanaev also discussed the Leadership-Class Computing Facility (LCCF), now in planning stages, led by the Texas Advanced Computing Center (TACC) with 27 academic partners and five distributed sites in the U.S.
- Lastly, he described the Major Research Instrumentation (MRI) program, a smaller-scale effort funding two classes of instruments, from $100,000 to $1.4 million and from $1.4 million to $4 million. In a change for this year, formal cost-sharing, which had previously been required, is prohibited in funded projects for the next five years.
- Tom Gulbransen reminded attendees that the ACCESS program enables the research community to use an expanding array of large-scale NSF-funded computational systems. Now entering its second year, ACCESS has roughly 5,000 direct users and 9,000 users via gateways, offering about 165 new allocations per month from 36 resource providers.
- Gulbransen also described OAC’s new Strengthening the Cyberinfrastructure Professionals Ecosystem (SCIPE) program, which offers up to four $15-million awards for institutions seeking to adopt service-model programs for professional development linked to specific scientific domains.
- Amy Apon talked about her Campus Cyberinfrastructure program, CC*. The program funds campus-level grants to institutions seeking science-driven CI expansion in the context of campus CI plans (though grants for developing campus CI plans are also available). The program currently offers network awards in the campus, regional, or innovation categories up to $650,000, $1.2 million, and $1 million, respectively; compute grants at campus and regional levels up to $500,000 and $1 million, respectively; and storage grants up to $500,000.
- Rob Beverly described CICI, the Cybersecurity Innovation for Cyberinfrastructure program. CICI supports applied cybersecurity research in science data workflows, operationalization of emerging cybersecurity technologies into the science CI domain, and development of new cybersecurity approaches specific to that domain. CICI currently has several funding areas: Usable and Collaborative Security for Science (UCSS) and Reference Scientific Security Datasets (RSSD) projects will be funded for up to three years at under $600,000 each; Transition to Cyberinfrastructure Resilience projects will be funded for up to three years at under $1.2 million.
- Ashok Srinivasan discussed the Training-Based Workforce Development for Advanced CI (CyberTraining) program. The program prioritizes broad adoption of CI tools, methods, and resources; integration of CI and Computational and Data-Enabled Science and Engineering skills into undergraduate and graduate curricula; and broadened access to and adoption of CI by varied institutions, scientific communities, and underrepresented groups. Project classes include two-year, $300,000 pilot projects; four-year, $500,000 small-implementation projects; and four-year, $1 million medium-scale projects.
- Varun Chandola described the Cyberinfrastructure for Sustained Scientific Innovation (CSSI) program. CSSI seeks to develop a robust, reliable, and sustainable data-software cyberinfrastructure. The program currently supports three classes of projects: “Elements” projects from small groups, supported up to $600,000 over three years; “Framework” projects, larger interdisciplinary efforts funded from $600,000 to $5 million over three to five years; and a new “Transition to Sustainability” track, which supports the development of sustainability plans for existing CI projects at up to $1 million for up to two years.
Bill Miller, OAC’s Senior Advisor for Cyberinfrastructure, discussed evolving federal policy for public access. In response to new guidance to federal agencies from the White House Office of Science and Technology Policy, NSF in June published its Public Access Plan 2.0, describing how the agency will encourage and require public reporting of research. New guidelines, to be issued in 2025, will require immediate public access to peer-reviewed reports and associated data arising from NSF awards. More details can be found at https://new.nsf.gov/public-access.