C3.ai Digital Transformation Institute Announces AI for Energy and Climate Security Grantees

June 10, 2021

URBANA, Ill. and BERKELEY, Calif., June 10, 2021 — C3.ai Digital Transformation Institute (C3.ai DTI) today announced the second round of C3.ai DTI awards, focused on using artificial intelligence (AI) techniques and digital transformation to advance energy efficiency and lead the way to a lower-carbon, higher-efficiency economy that will ensure energy and climate security.

C3.ai DTI issued this call for proposals in February 2021, and received 52 submissions. A rigorous peer review process led to 21 awards for research proposals to improve resilience, sustainability, and efficiency through such measures as carbon sequestration, carbon markets, hydrocarbon production, distributed renewables, and cybersecurity, among other topics.

The Institute awarded a total of $4.4 million in cash from this call for proposals, the second call the Institute has released since the organization’s launch in March 2020. In addition to cash awards, research teams gain access to up to $2 million in Azure Cloud computing resources, up to 800,000 supercomputing node hours on the Blue Waters petascale supercomputer at the National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign, up to 25 million computing hours on supercomputers at Lawrence Berkeley National Laboratory’s National Energy Research Scientific Computing Center (NERSC), and free, unlimited access to the C3 AI Suite hosted on the Microsoft Azure Cloud.

The 21 projects were each awarded $100,000 to $250,000, for an initial period of one year, in one of nine categories, as listed below by project title, principal investigator, and affiliation.

  • Sustainability – Applying AI, machine learning, and advanced analytics to support sustainability initiatives for energy consumption and greenhouse gas emissions:
    • Learning in Routing Games for Sustainable Electromobility (Henrik Sandberg, KTH Royal Institute of Technology)
    • AI-Driven Materials Discovery Framework for Energy-Efficient and Sustainable Electrochemical Separations (Xiao Su, University of Illinois at Urbana-Champaign)
  • AI for Carbon Sequestration – Applying AI/ML techniques to increase the scale and reduce costs of carbon sequestration:
    • Optimization of Agricultural Management for Soil Carbon Sequestration Using Deep Reinforcement Learning and Large-Scale Simulations (Naira Hovakimyan, University of Illinois at Urbana-Champaign)
    • Affordable Gigaton-Scale Carbon Sequestration: Navigating Autonomous Seaweed Growth Platforms by Leveraging Complex Ocean Currents and Machine Learning (Claire Tomlin, University of California, Berkeley)
  • AI for Advanced Energy and Carbon Markets – Enabling dynamic, automated, and real-time pricing of energy-generation sources:
    • Quantifying Carbon Credit Over the U.S. Midwestern Cropland Using AI-Based Data-Model Fusion (Kaiyu Guan, University of Illinois at Urbana-Champaign)
    • The Role of Interconnectivity and Strategic Behavior in Electric Power System Reliability (Ali Hortacsu, University of Chicago)
  • Cybersecurity of Power and Energy Infrastructure – Leveraging AI/ML techniques to improve the cybersecurity of critical power and energy assets, along with smart connected factories and homes:
    • Private Cyber-Secure Data-Driven Control of Distributed Energy Resources (Subhonmesh Bose, University of Illinois at Urbana-Champaign)
    • Cyberattacks and Anomalies for Power Systems: Defense Mechanism and Grid Fortification via Machine Learning Techniques (Javad Lavaei, University of California, Berkeley)
    • A Joint ML+Physics-Driven Approach for Cyber-Attack Resilience in Grid Energy Management (Amritanshu Pandey, Carnegie Mellon University)
  • Smart Grid Analytics – Applying AI and other analytic approaches to improve the efficiency and effectiveness of grid transmission and distribution operations:
    • Scalable Data-Driven Voltage Control of Ultra-Large-Scale Power Networks (Alejandro Dominguez-Garcia, University of Illinois at Urbana-Champaign)
    • Offline Reinforcement Learning for Energy-Efficient Power Grids (Sergey Levine, University of California, Berkeley)
  • Distributed Energy Resource Management – Applying AI to increase the penetration and use of distributed renewables:
    • Machine Learning for Power Electronics-Enabled Power Systems: A Unified ML Platform for Power Electronics, Power Systems, and Data Science (Minjie Chen, Princeton University)
    • Sharing Mobile Energy Storage: Platforms and Learning Algorithms (Kameshwar Poolla, University of California, Berkeley)
    • Data-Driven Control and Coordination of Smart Converters for Sustainable Power System Using Deep Reinforcement Learning (Qianwen Xu, KTH Royal Institute of Technology)
  • AI for Improved Natural Catastrophe Risk Assessment – Applying AI to improve modeling of natural catastrophe risks from future weather-related events (e.g., tropical storms, wildfires, and floods):
    • AI for Natural Catastrophes: Tropical Cyclone Modeling and Enabling the Resilience Paradigm (Arindam Banerjee, University of Illinois at Urbana-Champaign)
    • Multi-Scale Analysis for Improved Risk Assessment of Wildfires Facilitated by Data and Computation (Marta Gonzalez, University of California, Berkeley)
  • Resilient Energy Systems – Addressing how the use of AI/ML techniques and markets for energy and carbon introduce new vulnerabilities:
    • A Learning-Based Influence Model Approach to Cascading Failure Prediction (Eytan Modiano, Massachusetts Institute of Technology)
    • Reinforcement Learning for a Resilient Electric Power System (Alberto Sangiovanni-Vincentelli, University of California, Berkeley)
  • AI for Improved Climate Change Modeling – Use of AI/ML to address climate change modeling and adaptation:
    • Machine Learning to Reduce Uncertainty in the Effects of Fires on Climate (Hamish Gordon, Carnegie Mellon University)
    • AI-Based Prediction of Urban Climate and Its Impact on Built Environments (Wei Liu, KTH Royal Institute of Technology)
    • Interpretable Machine Learning Models to Improve Forecasting of Extreme-Weather-Causing Tropical Monster Storms (Da Yang, Lawrence Berkeley National Laboratory)

“From wildfires to rising seas to monster storms crippling our energy systems, increasingly extreme weather clearly represents a severe threat to our economy, infrastructure, and national security,” said S. Shankar Sastry, C3.ai DTI Co-Director and Thomas M. Siebel Professor of Computer Science at the University of California, Berkeley. “Improving climate resilience will require profound changes powered by a new era of technologies like those C3.ai DTI is supporting today.”

“A number of energy companies and utilities have used enterprise AI to transform their operations, but as we can see, there’s an even greater need for resilience to cyberattacks and large environmental disruptions,” said R. Srikant, C3.ai DTI Co-Director and Fredric G. and Elizabeth H. Nearing Endowed Professor of Electrical and Computer Engineering at the University of Illinois at Urbana-Champaign. “These projects are designed with those goals in mind.”

Award Criteria

C3.ai DTI selects research proposals that inspire cooperative research and advance machine learning and other AI subdisciplines. Projects are peer-reviewed on the basis of scientific merit; prior accomplishments of the principal investigator and co-principal investigators; the use of AI, machine learning, data analytics, and cloud computing in the research project; and the suitability for testing the methods at scale. Visit C3DTI.ai to learn more about the Institute’s programs, award opportunities, and selected research proposals.

About C3.ai Digital Transformation Institute

Established in March 2020 by C3 AI, Microsoft, and leading universities, the C3.ai Digital Transformation Institute is a research consortium dedicated to accelerating the benefits of artificial intelligence for business, government, and society. The Institute engages the world’s leading scientists to conduct research and train practitioners in the new Science of Digital Transformation – operating at the intersection of artificial intelligence, machine learning, cloud computing, internet of things, big data analytics, organizational behavior, public policy, and ethics.

The ten C3.ai Digital Transformation Institute consortium member universities and laboratories are: University of California, Berkeley, University of Illinois at Urbana-Champaign, Carnegie Mellon University, KTH Royal Institute of Technology, Lawrence Berkeley National Laboratory, Massachusetts Institute of Technology, National Center for Supercomputing Applications at University of Illinois at Urbana-Champaign, Princeton University, Stanford University, and University of Chicago. Additional industry partners include AstraZeneca, Baker Hughes, and Shell.

To support the Institute, C3 AI is providing the Institute $57,250,000 in cash contributions over the first five years of operation. C3 AI and Microsoft will contribute an additional $310 million of in-kind support, including use of the C3 AI Suite and Microsoft Azure computing, storage, and technical resources to support C3.ai DTI research.

About C3.ai, Inc.

C3.ai, Inc. (NYSE:AI) is an Enterprise AI application software company that accelerates digital transformation for organizations globally. C3 AI delivers a family of fully integrated products: C3 AI Suite, an end-to-end platform for developing, deploying, and operating large-scale AI applications; C3 AI Applications, a portfolio of industry-specific SaaS AI applications; C3 AI CRM, a suite of industry-specific CRM applications designed for AI and machine learning; and C3 AI Ex Machina, a no-code AI solution to apply data science to everyday business problems. The core of the C3 AI offering is an open, model-driven AI architecture that dramatically simplifies data science and application development. Learn more at: www.c3.ai.


Source: C3.ai, Inc.
