CCC Offers Draft 20-Year AI Roadmap; Seeks Comments

By John Russell

May 14, 2019

Artificial Intelligence in all its guises has captured much of the conversation in HPC and general computing today. The White House, DARPA, IARPA, and Department of Energy all have issued strategies or undertaken programs intended to foster AI development and use. Yesterday, the Computing Community Consortium (CCC) weighed in with a 100-plus page draft report – A 20-Year Community Roadmap for Artificial Intelligence Research in the US – and CCC is seeking comment around its concepts and recommendations.

The CCC, of course, is a body formed to “define the large-scale infrastructure needs of the computing research community” that was created in response to a National Science Foundation (NSF) solicitation in 2006. In turn, the CCC is part of the Computing Research Association (CRA) founded in 1972 and encompassing academia, industry, and government; its proposals, among other things, help inform NSF activities and federal computing priorities.

As noted on the CCC website, “The CCC Council meets three times every calendar year, including at least one meeting in Washington, D.C., and has biweekly conference calls between these meetings. Also, the CCC leadership has biweekly conference calls with the leadership of NSF’s Directorate for Computer and Information Science and Engineering (CISE).”

CCC began work on the new AI roadmap last fall, held three workshops and a ‘Town Hall’ meeting in 2019, and yesterday issued a blog calling for comment on its roadmap. Comments are due by May 28, 2019.

Honestly, such a large document is best parsed by reading it directly, and CCC has packed its AI roadmap with all manner of observations and suggestions. Here are its major recommendations, excerpted from the blog:

I – Create and Operate a National AI Infrastructure to serve academia, industry, and government through four interlocking capabilities:

a) Open AI platforms and resources: a vast interlinked distributed collection of “AI-ready” resources (curated high-quality datasets, software libraries, knowledge repositories, instrumented homes and hospitals, robotics environments, cloud-scale computing services, etc.) contributed by and available to the academic research community, as well as to industry and government. Recent major innovations from companies demonstrate that AI breakthroughs require large-scale hardware investments and open-source software infrastructures, both of which require substantial ongoing investments.

b) Sustained community-driven AI challenges: organizational structures that coordinate the formulation of grand-challenge problems by AI and domain experts to drive research in key areas, building upon—and adding to—the shared resources in the Open AI Platforms and Facilities.

c) National AI Research Centers: physical and virtual facilities that bring together Faculty Fellows from a range of academic institutions and Industry Fellows from industry and government in multi-year funded projects focused on pivotal areas of long-term AI research.

d) Mission-Driven AI Laboratories: living laboratories that provide sustained infrastructure, facilities, and human resources to support the Open AI Platforms and the AI Challenges, and work closely with the National AI Research Centers to integrate results to address critical AI challenges in vertical sectors of public interest such as health, education, policy, ethics, and science.

II – Re-conceptualize and Train an All-Encompassing AI Workforce, building upon the elements of the National AI Infrastructure listed above to create:

a) Development of AI Curricula at All Levels: guidelines should be developed for curricula that encourage early and ongoing interest in and understanding of AI, beginning in K-12 and extending through graduate courses and professional programs.

b) Recruitment and Retention Programs for Advanced AI Degrees: including grants for talented students to obtain advanced graduate degrees, retention programs for doctoral-level researchers, and additional resources to support and enfranchise AI teaching faculty.

c) Engaging Underrepresented and Underprivileged Groups: programs to bring the best talent into the AI research effort.

d) Incentivizing Emerging Interdisciplinary AI Areas: initiatives to encourage students and the research community to work in interdisciplinary AI studies—e.g., AI-related policy and law, AI safety engineering, as well as analysis of the impact of AI on society—will ensure a workforce and a research ecosystem that understands the full context for AI solutions.

e) Training Highly Skilled AI Engineers and Technicians, to support and build upon the Open AI Platform to grow the AI pipeline through community colleges, workforce retraining programs, certificate programs, and online degrees.

III – Core Programs for AI Research are critical. These new resources and initiatives cannot come at the expense of existing programs for funding theoretical and applied AI. These core programs—which provide well-established, broad-based support for research progress, for training young researchers, for integrating AI research and education, and for nucleating novel interdisciplinary collaborations—are critical complements to the broader initiatives described in this Roadmap, and they too will require expanded support.

As you can see, there’s a lot here and that includes calling for increased spending in the context of a global race for AI. The report declares, “U.S. leadership in AI is at risk without significant, strategic investments, new models for infrastructure and resources, and attention to the education and training pipeline. Other major industrialized countries are already embarking on substantial AI research programs.

  • The EU has announced funding of 20B Euros for AI and is currently evaluating proposals for decade-long 1B Euro science projects, one of them in the area of AI assistants. Germany and France have allocated 3B and 1.5B Euros to AI, respectively. The UK has pledged an investment of 1B Pounds in AI, together with dedicated funding for 1,000 PhDs and 8,000 specialized teachers in AI, and has repurposed its flagship Turing Institutes into major data-driven AI research centers.
  • China has announced that it will invest billions in AI over the next five years, creating at least four $50M/year AI Centers and a $1B/year National AI Research laboratory with thousands of AI researchers and engineers, and committing to training 500 instructors and 5,000 students at major universities…”

These efforts, CCC argues, “are in line with major U.S. research investments in the past, such as the LIGO project ($1.1B), the Human Genome project ($2.7B), and the Apollo program ($144B), all of which not only led to major scientific advances, but also produced significant economic and societal benefits.”

The recommended national Pivot AI Research Centers (PAIRCs), intended to create unique and stable environments for large multi-disciplinary teams devoted to long-term AI research, aren’t cheap. The report says, “Each PAIRC would be funded in the range of $100M/year for at least 10 years. With this level of funding, a PAIRC would be able to support an ecosystem of roughly 100 full-time faculty (in AI and other relevant disciplines), 50 visiting fellows (faculty and industry), 200 AI engineers, and 500 students (graduate and undergraduate), and sufficient computing and infrastructure support.”

By way of analogy, the report notes, “There are a few examples of AI research centers that have long-term funding. The University of Maryland’s Center for Advanced Study of Language (CASL), founded in 2003 as a DoD-sponsored University Affiliated Research Center (UARC) funded by the National Security Agency, includes about 60 researchers and 70 visitors from academia and industry focused on natural language research with a defense focus.”

How the report translates into NSF or further US-funded AI activities, of course, remains to be seen.

Link to CCC blog: https://www.cccblog.org/2019/05/13/request-comments-on-draft-a-20-year-community-roadmap-for-ai-research-in-the-us/?utm_source=feedblitz&utm_medium=FeedBlitzRss&utm_campaign=cccblog

Link to draft AI Roadmap: https://cra.org/ccc/wp-content/uploads/sites/2/2019/05/AIRoadmapDraftforCommunityMay2019.pdf

Link to comment form: https://computingresearch.wufoo.com/forms/s15u6ssf15mvnlg/

Figures all taken from the draft report
