PEARC23 Day Two Plenary: Firefighting and Funding

By Ken Chiacchia, Pittsburgh Supercomputing Center 

July 27, 2023

Editor’s note: The Day 1 and Day 2 reports from PEARC23 got crossed in the wires. Both reports are now posted. Thanks to Ken Chiacchia of the Pittsburgh Supercomputing Center for his great reports.

Plenary sessions on the second day of PEARC23 in Portland, Oregon, focused on applying cyberinfrastructure to the varied and complex needs of wildland firefighting, along with an update from the NSF Office of Advanced Cyberinfrastructure on its funding programs.

The annual PEARC conference series provides a forum for discussing challenges, opportunities, and solutions among the broad range of participants in the research computing community. Building on the successes of the past, the series aims to integrate and meet the collective interests of that growing community. The theme of PEARC23 is “Computing for the Common Good.” 

Fighting Fires Using Data and Computing 

“Convergent research” was the theme of Ilkay Altintas’s plenary presentation — integrating edge users into the scientific workflow so that, from the outset, cyberinfrastructure centers on practical applications in the field. As an example, she described efforts by the WIFIRE Lab within the Cyberinfrastructure and Convergence Research and Education Division (CICORE) at the San Diego Supercomputer Center, which she directs, to operationalize data and computation in wildland firefighting. 

“When you think of convergence research, it’s built on other forms of research; so, it doesn’t happen in isolation,” Altintas said. In wildland fires, that means early, deep engagement by firefighters. “They together with us co-design the solution and become co-owners of the solution.”

The approach requires a change of focus from individual workflows to “teamflows,” she added. Wildland fires are typical among such projects in that they require a fusion of data, cyberinfrastructure, and machine learning to attack fires both immediately, as part of a firefighting response to a given blaze, and over time, as part of the planning and preventive measures needed to reduce risk long-term. The number of information sources, the complexity and varied provenance of the data, and the number and needs of different users all contribute to the scope of the problem. 

The 2020 West Coast fires were devastating by any measure, Altintas noted. The fires burned 10 million acres — 4% of California’s land area — causing $16 billion in property damage, not counting the $3.5 billion cost of fighting them. 

“Wildland fires are not always a problem,” she said. “Megafires are,” because the latter are of higher intensity, move more quickly, and cover more area. These events pose vastly higher risk to life and property, challenges in evacuating humans and livestock to safety, and of course difficulty in extinguishment. 

The difference, Altintas said, lies in the distinction between “regular weather” and “fire weather” — conditions that favor the growth and intensity of a fire. Fires in normal conditions burn and are fought without making more than local news. On the other hand, fire weather conditions, fueled in California by the Santa Ana winds, represent many more acres burned and a greater, more rapidly moving threat. 

Computational aids to firefighting must address distinct levels of complexity and time urgency. Firefighters trying to extinguish a wildland fire need tools that analyze the data rapidly, but don’t necessarily need high resolution; ecosystem sustainability and risk-management efforts require high detail but can tolerate more lengthy computation. Other firefighting interventions may need a specific balance of speed and resolution. 

Data sources for fire interventions may include landscape/terrain data, vegetation cover and fuel, real-time fire perimeter observations, ground-based weather reports, and the results of weather forecasting and modeling. Challenges include poor data quality, provenance, and availability; lack of standards and transparency in agencies whose mission is to fight fire, not generate data for use by other agencies; and a vast volume of not-always-relevant data that overwhelms decision making.
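To make the data-quality problem concrete, the sketch below shows one loose way such heterogeneous feeds might be screened before fusion. The record shape, field names, and thresholds are all invented for illustration; they are not taken from WIFIRE's actual pipeline.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical record shape: real feeds (perimeters, weather stations,
# fuel maps) arrive with very different schemas and update cadences.
@dataclass
class Observation:
    source: str          # e.g. "ground_station", "satellite_perimeter"
    kind: str            # "weather", "perimeter", "fuel"
    timestamp: datetime
    quality: float       # 0.0-1.0 confidence score assigned on ingest

def usable(obs: Observation, now: datetime,
           max_age: timedelta, min_quality: float) -> bool:
    """Drop stale or low-confidence records before fusing sources."""
    return (now - obs.timestamp) <= max_age and obs.quality >= min_quality

now = datetime(2023, 7, 26, 12, 0)
feed = [
    Observation("ground_station", "weather", now - timedelta(minutes=10), 0.9),
    Observation("satellite_perimeter", "perimeter", now - timedelta(hours=6), 0.8),
    Observation("crowdsourced", "perimeter", now - timedelta(minutes=5), 0.3),
]

# A fast-response tool might demand fresh data above all; a long-term
# risk model could tolerate older data but insist on higher confidence.
fresh = [o for o in feed if usable(o, now, timedelta(hours=1), 0.5)]
print([o.source for o in fresh])  # only the ground station passes both cuts
```

Different consumers would simply swap in different `max_age` and `min_quality` cutoffs, reflecting the speed-versus-resolution tradeoff described above.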

“We need fast solutions, and we need standardized and collaborative data infrastructure,” Altintas said. 

WIFIRE Lab is an all-hazards knowledge cyberinfrastructure project based at SDSC.

Over the last decade, WIFIRE Lab has been building WIFIRE Commons, an infrastructure that reaches out to these data sources and federates them in a searchable, accessible data model. Other products of the center include: 

  • FiREMAP, a tool used by the state of California since 2016 for rapid, reactive solutions in fighting given fires 
  • BurnPro3D, a proactive tool for conducting prescribed burns, using controlled fires to clear fuel in a way that reduces the risk of fires growing and moving 
  • A deep-learning-based approach to detecting smoke plumes, still in testing 
  • Machine learning models developed on top of QUIC-Fire, a next-generation fire behavior model developed at LANL, which bridge the gap between 2D fire-perimeter images and a 3D model, improving resolution and accuracy in predicting fire behavior 
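The federation idea behind WIFIRE Commons can be sketched loosely as a shared catalog that many providers register into and that users query through one interface. Every class name, field, provider, and URI below is hypothetical; this is not WIFIRE Commons' actual API.

```python
# Minimal sketch of federating heterogeneous sources behind one
# searchable interface, in the spirit of a data commons.

class Catalog:
    def __init__(self):
        self._entries = []

    def register(self, source, kind, region, uri):
        """Each provider registers its datasets in a shared schema."""
        self._entries.append(
            {"source": source, "kind": kind, "region": region, "uri": uri}
        )

    def search(self, kind=None, region=None):
        """Users query one catalog instead of every agency separately."""
        return [
            e for e in self._entries
            if (kind is None or e["kind"] == kind)
            and (region is None or e["region"] == region)
        ]

commons = Catalog()
commons.register("USGS", "terrain", "CA", "https://example.org/dem")
commons.register("NWS", "weather", "CA", "https://example.org/forecast")
commons.register("CAL FIRE", "perimeter", "CA", "https://example.org/perims")

hits = commons.search(kind="perimeter", region="CA")
print([h["source"] for h in hits])  # ['CAL FIRE']
```

The value of the shared schema is that a tool like a perimeter tracker can ask one question of one catalog, regardless of which agency originally published the data.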

“Emerging new applications require integrated AI in dynamically composed workflows,” Altintas said. “I think as a community [cyberinfrastructure professionals] need to start embracing this complexity and inviting others … to turn these solutions to societal-scale, sustainable solutions.” 

Accessible, Inclusive, and Sustainable Cyberinfrastructure Ecosystem Development through NSF Support 

In the second part of the day’s plenary session, Katie Antypas, NSF’s new Director of the Office of Advanced Cyberinfrastructure, headlined a panel of OAC’s program directors who surveyed the office’s programs. 

Antypas introduced the OAC’s $252 million portfolio and its priorities. Responding to national initiatives and legislation, OAC is undertaking workshops to solicit community needs and feedback; engaging user communities to identify requirements; and responding to advice from external advisory committees, National Academy of Sciences reports, and other expert input. 

“The changing user communities, technologies, vendors, business models and national landscape requires us to think deeply about our collective strategy for the future,” Antypas said. 

She presented the office’s priorities for community feedback: advancing and interconnecting a broad cyberinfrastructure (CI) ecosystem, developing human infrastructure in the form of workforce development, enabling scientific discovery through data, providing leadership in the planning of a National AI Research Resource, investing in and transitioning to new and innovative technologies, and developing partnerships, all with the aim of positioning the community for long-term U.S. leadership in research CI. 

“[We need to] democratize the AI ecosystem to ensure that the research we do as a community is consistent with the values of the nation,” she said. 

OAC’s program directors offered lightning summaries of their programs: 

  • Andrey Kanaev described his Advanced Computing System and Services program. The ACSS is requesting proposals from organizations willing to serve as resource providers of advanced CI capabilities or services that support the full range of computational and data-intensive research across all of science and engineering. The program will fund Category 1 capacity systems up to a level of $10 million and Category 2 prototype testbeds to $5 million. 
  • Kanaev also discussed the Leadership-Class Computing Facility (LCCF), now in planning stages, led by the Texas Advanced Computing Center (TACC) with 27 academic partners and five distributed sites in the U.S. 
  • Lastly, he described the Major Research Instrumentation (MRI) program, a smaller-scale effort funding two classes of instruments, from $100,000 to $1.4 million and from $1.4 million to $4 million. A change for this year will be the prohibition, for the next five years, of formal cost-sharing in the funded projects. Formerly, cost-sharing had been required. 
  • Tom Gulbransen reminded attendees that the ACCESS program enables the research community to utilize an expanding array of large-scale NSF-funded computational systems. Now entering its second year, ACCESS has roughly 5,000 direct users and 9,000 users via gateways, offering about 165 new allocations per month from 36 resource providers. 
  • Gulbransen also described OAC’s new Strengthening the Cyberinfrastructure Professionals Ecosystem (SCIPE) program, which offers up to four $15-million awards for institutions seeking to adopt service-model programs for professional development linked to specific scientific domains. 
  • Amy Apon talked about her Campus Cyberinfrastructure program, CC*. The program is funding campus-level grants to institutions seeking science-driven CI expansion in the context of campus CI plans (though grants for developing campus CI plans are also available). The program currently offers network awards in the campus, regional or innovation categories up to $650,000, $1.2 million, and $1 million, respectively; compute grants at campus and regional levels up to $500,000 and $1 million, respectively; and storage grants up to $500,000. 
  • Rob Beverly described CICI, the Cybersecurity Innovation for CyberInfrastructure program. CICI supports applied cybersecurity research in science data workflows, operationalization of emerging cybersecurity technologies into the science CI domain, and development of new cybersecurity approaches specific to that domain. CICI currently has several funding areas: projects for Usable and Collaborative Security for Science (UCSS) and Reference Scientific Security Datasets (RSSD) will be funded for up to three years at under $600,000; projects for Transition to CI Resilience will be funded for up to three years at under $1.2 million. 
  • Ashok Srinivasan discussed the Training-Based Workforce Development for Advanced CI (CyberTraining) program. The program prioritizes broad adoption of CI tools, methods, and resources; integration of CI and Computational and Data-Enabled Science and Engineering skills into undergraduate and graduate curricula; and broadened access and adoption of CI by varied institutions, scientific communities, and underrepresented groups. Project classes include two-year, $300,000 pilot projects; four-year, $500,000 small-implementation projects; and four-year, $1 million medium-scale projects. 
  • Varun Chandola described the Cyberinfrastructure for Sustained Scientific Innovation (CSSI) program. CSSI seeks to develop a robust, reliable, and sustainable data-software cyberinfrastructure. The program currently supports three classes of projects: “Elements” projects from small groups are supported up to $600,000 over three years; “Framework” projects are larger, interdisciplinary efforts funded from $600,000 to $5 million over three to five years; and a new “Transition to Sustainability” track supports the development of sustainability plans for existing CI projects at up to $1 million for up to two years. 

Bill Miller, OAC’s Senior Advisor for Cyberinfrastructure, discussed evolving federal policy for public access. In response to new guidance to federal agencies from the White House Office of Science and Technology Policy, in June NSF published its Public Access Plan 2.0 describing how the agency will encourage and require public reporting of research. New guidelines, to be issued in 2025, will require immediate public access to peer-reviewed reports and associated data arising from NSF awards. More details can be found at https://new.nsf.gov/public-access. 

 
