President’s Council Targets AI, Quantum, STEM; Recommends Spending Growth

By John Russell

July 9, 2020

Last week the President's Council of Advisors on Science and Technology (PCAST) met (via webinar) to review policy recommendations around three sub-committee reports: 1) Industries of the Future (IotF), chaired by Dario Gil (director of research, IBM); 2) Meeting STEM Education and Workforce Needs, chaired by Catherine Bessant (CTO, Bank of America); and 3) New Models of Engagement for Federal/National Laboratories in the Multi-Sector R&D Enterprise, chaired by Dr. A.N. Sreeram (SVP and CTO, Dow Corp.).

Yesterday, the full report (Recommendations For Strengthening American Leadership In Industries Of The Future) was issued and it is fascinating and wide-ranging. To give you a sense of the scope, here are three highlights taken from the executive summary of the full report:

  • With regard to the first pillar, Federal agencies need to take full advantage of their administrative authorities to partner with industry and academia in new and innovative ways, particularly to ensure the effective transition and translation of early-stage research outcomes into applications at scale. In the area of AI, this includes establishing a joint AI Fellow-in-Residence program, AI Research Institutes in all 50 States, National AI Testbeds, partnerships for curating and sharing large datasets, and joint international programs for attracting and retaining the best global talent, and research and development (R&D) and training for trustworthy AI.
  • The second pillar of this report homes in on a new model for leveraging the strength of America’s National Laboratories to enhance and accelerate substantial front-to-back progress in IotF. The cornerstone recommendation involves establishing a new type of world-class, multi-sector R&D institute that catalyzes innovation at all stages of R&D—from discovery research to development, deployment, and commercialization of new technologies. These highly prestigious “IotF Institutes” would support portfolios of collaborative projects at the intersection of two or more IotF pillars, and be structured to minimize burdensome administrative overhead so as to maximize rapid progress. They would utilize innovative intellectual property terms that incentivize participation by industry, academia, and non-profits as a means for driving commercialization of IotF technologies at scale.
  • Achieving success with the first two pillars of this report rests upon the Nation’s ability to strengthen, grow, and diversify its science, technology, engineering, and mathematics (STEM) workforce at all levels—from skilled technical workers to researchers with advanced degrees. First and foremost, America must build the Workforce of the Future by creating STEM training and education opportunities for individuals from all backgrounds, STEM and non-STEM, including underrepresented and underserved populations. Employers, academic institutions, professional societies, and other partners should develop programs to provide non-STEM workers with professional competencies that will grant them a role in the STEM Workforce of the Future. Public- and private-sector employers should be recruited to pledge and realize support for hiring newly skilled STEM workers, especially those from non-traditional backgrounds, into STEM positions.

The devil will be in the details. Summer, of course, is the time when diverse public and private groups working on policy – in this case technology policy – formulate their ideas in more concrete forms for consideration by policy-makers. The just-released PCAST report is one example. Recently proposed legislation to remake NSF is another example (See HPCwire article, $100B Plan Submitted for Massive Remake and Expansion of NSF). It’s difficult to gauge the impact of competing proposals at this stage.

Loosely, PCAST provides advice to the President on science and technology issues. It has existed in various forms under varying names stretching back to Teddy Roosevelt. PCAST in its current form dates to President George H. W. Bush, who renamed the group and switched it from reporting to the White House science advisor to reporting directly to the President. PCAST lives in the Office of Science and Technology Policy (OSTP), and Kelvin Droegemeier, the director of OSTP, also chairs PCAST. Droegemeier presided at the most recent meeting.

Current PCAST efforts emphasize expanding industry participation in key technologies including closer collaboration with government and academia. Formally, the PCAST report is the work of three sub-committees: American Global Leadership in Industries of the Future (Gil); Meeting National Needs for STEM Education and a Diverse, Multi-Sector Workforce (Bessant); and New Models of Engagement for Federal and National Laboratories in the Multi-Sector R&D Enterprise (Sreeram).

Perhaps the most relevant near-term proposal for the HPC community is the report led by Gil, whose webinar presentation focused largely on AI and quantum science. Here's a snippet from Gil's prepared remarks at the webinar.

Dario Gil, IBM Research Director

“It is time to scale AI. Throughout all our conversations with leaders across federal agencies, universities, nonprofits and industry, it is clear that AI is at the top of the list of technologies that can make the biggest difference to the health, prosperity and security of our nation…That is why we’re recommending a 10x R&D investment growth over 10 years in AI research and applied institutes in all 50 states [and] an unwavering commitment to invest in developing and attracting the best talent into this field and industry,” said Gil.

The report notes “industry will invest more than $2 billion between 2020–2025 to design, build, and deploy high-availability quantum computing systems, execute a roadmap that will at least double system performance every year, and build cloud-accessible quantum computational centers and associated services,” and recommends a Federal investment of $100 million per year for the next five years to create quantum computing user facilities that leverage the output of the multi-billion-dollar investments industry is making in designing and building quantum computing systems.

Broadly, Gil noted, “The mission of our subcommittee is to collaborate on an action plan for ensuring American leadership in the industries of the future, which include quantum information sciences, artificial intelligence, advanced wireless communications, advanced manufacturing and biotechnology.

“We were asked and tasked with identifying and making recommendations on strategic steps to help bridge critical gaps and augment and strengthen existing federal actions [and] to identify new opportunities and some strategic actions that can be taken to accelerate the industries, and recommend strategies for enhancing cross sector and international cooperation. Our recommendations today are focused on AI and quantum computing, two rapidly advancing technologies and also on the opportunities that are present in their convergence.”

Much of the material is familiar to the HPC community. Given the added funding already promised for AI and quantum, getting more may prove to be a tough sell in the current climate. However, given that PCAST’s charge presumably stems directly from the President’s office, it will be interesting to watch how its recommendations fare.

Gil set the stage, “Over the last decade, powered by exponential growth in computing power, and ever increasing availability of data, technological breakthroughs in AI are enabling intelligent systems to take on increasingly sophisticated tasks and augmenting human capabilities in a new and profound way. It is undoubtedly the case that AI has emerged as one of the most important technologies of our era. And in recent months, during the COVID-19 crisis, AI has demonstrated critical capabilities as well as important potential for the future.”

“We recommend sustained investment growth of a billion dollars per year in non-defense research funding through 2030 as described in the paper,” he said. Noting that the 2020 budget request for AI within NSF is $487 million, he said, “[B]ased on the number of highly rated proposals that currently go unfunded and are of equal merit to those which are funded, PCAST anticipates that growing the investment by $1 billion a year would allow for making at least 1,000 additional awards to individual investigators without any loss of quality.”

Here’s a somewhat extended excerpt from Gil’s prepared remarks with more detail on AI recommendations:

“It is important to create a virtuous cycle aimed at the innovation infrastructure itself to continuously accelerate R&D in AI. The COVID-19 High Performance Computing Consortium and its coordinated data sets are examples of the enormous value of creating platforms for sharing data and computational resources to accelerate efforts in science and technology related to the crisis. The creation of a national research cloud currently being considered by Congress is another good example. We therefore recommend the creation of national AI testbeds, first securing U.S. industry pledges to support AI infrastructure. This would include grants to provide compute infrastructure for research and education related to AI, including things like free cloud credits and high-performance computing cluster donations to universities, and open-source AI frameworks, libraries and tools to democratize access to the latest advances in AI.

“[We] recommend expanding the ongoing NSF based programs to establish national AI research centers and infrastructure with sustained long-term funding to enable cross-cutting research and technology transitions. These centers would enable research on core and applied AI in addition to the AI research institutes that we discussed a minute ago. Also, applied AI Institutes [could] focus on things like agriculture or AI for manufacturing, as well as cross-cutting AI topics such as AI for social good [such as] future of work and harnessing big data. It is quite clear that AI is going to touch every area of science. We therefore recommend directing the AI science mission at the National Labs and across federal agencies to drive the technical foundation for performing scientific research with AI powered methods.

“And because data is the fuel for AI, we recommend tasking federal agencies such as NIST and NIH to curate, manage and disseminate large data sets across critical areas for AI applications, working across U.S. agencies, industry partners, and other stakeholders. We cannot emphasize enough how critical it is to get data AI-ready. Let’s remember that 80 percent of the effort of any AI project is typically spent on data curation and preparation.”

There is also a proposal in the report to use AI-driven workflows to improve research and discovery productivity; a figure in the report illustrates the approach.

The webinar slides present more of PCAST’s proposed AI activities; it’s best to consult the full report.

HPCwire will provide further coverage of the report later.

Link to the full report, Recommendations For Strengthening American Leadership In Industries Of The Future, https://science.osti.gov/-/media/_/pdf/about/pcast/202006/PCAST_June_2020_Report.pdf?la=en&hash=019A4F17C79FDEE5005C51D3D6CAC81FB31E3ABC 

Link to the webinar agenda and all three slide presentations: https://science.osti.gov/About/PCAST/Meetings/202006
