ITIF Report Aims to Sway Congress, Promote National HPC Agenda

By John Russell

April 28, 2016

The Information Technology and Innovation Foundation (ITIF), a Washington, D.C., think tank with close ties to the Office of Science and Technology Policy and to government broadly, today released an expansive report – The Vital Importance of High-Performance Computing to U.S. Competitiveness – and also held a panel to discuss the report’s recommendations. Notably, many of the panelists are familiar names in the HPC community.

All boiled down, the ITIF report is another call for national action in support of HPC, similar in tone to (and supportive of) the National Strategic Computing Initiative (NSCI). Calling HPC a strategic, game-changing technology with tremendous economic competitiveness, science leadership, and national security implications, the ITIF document treads familiar NSCI ground, as this brief excerpt shows:

“Because HPC stands at the forefront of scientific discovery and commercial innovation, it is positioned at the frontier of competition—for nations and their enterprises alike—making U.S. strength in producing and adopting HPC central to its competitiveness. But as competitor nations rapidly scale up their investments in and applications of high-performance computing, America will need concerted public and private collaboration and investment to maintain its leading position in both HPC production and application.”

How effective this latest call for a coordinated national HPC policy will be is an open question. ITIF’s bipartisan nature may help, say observers, and the keynote before today’s panel was delivered by Rep. Randy Hultgren (R-IL-14).

The scheduled panel featured prominent HPC organizations and senior personnel:

  • Intel – Joseph Curley, senior director, HPC Platform and Ecosystem Enablement, High Performance Computing Platform Group.
  • Hewlett Packard Enterprise – Bill Mannel, vice president and general manager, High-Performance Computing and Big Data.
  • IBM – David Turek, vice president, Exascale Systems.
  • National Renewable Energy Laboratory (NREL) – Steven Hammond, director of the Computational Science Center.
  • IDC – Robert Sorensen, research vice president, High Performance Computing.
  • ITIF – Stephen Ezell, one of the report’s authors and vice president, Global Innovation Policy.

It’s good to recall that the draft implementation plan for NSCI has yet to be made public. IDC’s Sorensen, who spent 30-plus years as a senior technology analyst in government, said he was among the many senior HPC watchers extensively interviewed by ITIF for the report: “They were very interested in getting the story right, and their credibility is good. But to be sure, they are not HPC experts, but they have good political cred here in DC.”

Turek of IBM said after the panel, “The report is well intentioned and mostly correct. [It] makes a compelling case that HPC is widely used and particularly beneficial, but who needs to read it to be convinced of it.” Turek wondered how best to handle the ‘operationalizing’ challenge, which was not spelled out as clearly as the goals.

Unlike the NSCI executive order, the ITIF report presents a fair number of details and examples of the role and impact of HPC on industry, as well as a compilation of major global initiatives seeking HPC leadership. China’s developing plans – including expectations it will fire up two 100-petaflops computers this year – receive lengthy treatment.

[Table: ITIF compilation of national HPC programs]

Excerpt: “Clearly, China has made HPC leadership a national priority. A key reason for this is that, for China, leadership in high-performance computing is central to the country’s goal of transitioning away from reliance on foreign technology to using homegrown technology. As Li Na, a spokesperson for the Tianhe-2 project, explains, “We are producing supercomputers with a fundamental purpose of providing a driving force for the construction of an innovation-oriented country.” As IDC’s Rajnish Arora explains, “The Chinese government and companies want to become the creators and not just producer of products that are being designed elsewhere.” Or, as Chinese President Xi Jinping himself puts it, China has built its HPC capabilities in part to demonstrate that the country has become a cyber power.”

In the release announcing the report, ITIF’s Ezell is quoted: “The U.S. is home to three of the five fastest supercomputers in the world, but China is home to the global frontrunner and plans to launch an even faster supercomputer this year. Japan and the EU have also introduced concerted national programs to achieve high-performance computing leadership. While America is still the world leader, other nations are gaining on us, so the U.S. cannot afford to rest on its laurels. It is important for policymakers to build on efforts the Obama administration has undertaken to ensure the U.S. does not get outpaced.”

National plans/aspirations in Europe, Japan, Russia, South Korea, and India are also briefly reviewed. For those familiar with HPC, the full report is a fast read – despite its 50-plus-page length.

The ITIF report calls for energizing NSCI and for the additional steps listed here:

Congress should:

  • Hold hearings on the National Strategic Computing Initiative (NSCI) and the intensifying race for global HPC leadership.
  • Authorize and appropriate funding for the National Strategic Computing Initiative at the levels requested in the Obama administration’s FY 2017 budget, for FY 2017 and for future years.
  • Reform export control regulations to match the reality of current high-performance computing systems.

The administration, or its agencies and departments therein, should:

  • Continue to make technology transfer and commercialization activities a priority focus of America’s network of national laboratories.
  • Emphasize HPC in federal worker training and retraining programs.
  • Emphasize HPC in relevant Manufacturing Extension Partnership (MEP) engagements, helping facilitate small- to medium-sized enterprises’ (SME) access to high-performance computing.

Turek made a few suggestions for modification or additions to the report’s policy recommendations. For example, with respect to NSCI, Turek said, “It would be appropriate formally to get leaders from American industry involved to establish the list of industry grand challenges so there is a direct linkage between the activities of NSCI and impact on competitiveness as opposed to thinking it is going to be accomplished through some sort of indirect trickle down effect. I also thought it would be motivating for people to come and work on these problems.”

The expansion of industrial use of HPC is certainly a central tenet of the report. It notes: “Finally, in February 2016, as part of its HPC4Mfg (HPC for Manufacturing) challenge, the Department of Energy announced $3 million in funding for 10 projects that will allow manufacturers to tap into the power of HPC systems at DOE-managed national laboratories. Each of the projects is designed to leverage HPC to improve efficiency, enhance product development, or reduce energy consumption. For example, one initiative will help Global Foundries optimize semiconductor transistor design, and in another GE will leverage advanced HPC particle physics simulations to improve the efficiency and lifespan of its aircraft engines. The vision is to grow this concept from just HPC4Mfg into an HPC4X template where the same process can be applied to HPC4transportation, HPC4life sciences, etc.”

The compilation of examples of HPC’s impact on industry is extensive and although many are widely known, it’s the scope of HPC’s current potential effect that is most interesting. Here are just three from the report, which is freely available online:

  • “Boeing physically tested 77 prototype wing designs for its 767 aircraft (which was designed in the 1980s), but for its new 787 Dreamliner, only 11 wing designs were physically tested (a 7-fold reduction in the needed amount of prototyping), primarily because over 800,000 hours of supercomputer simulations had drastically reduced the need for physical prototyping.”
  • “HPC has facilitated development of a cloud-based tool that simulates welding processes used in metallic product manufacturing. The application, being developed by the Ohio Supercomputer Center (OSC) and the Engineering Mechanics Corporation of Columbus, in part through a Small Business Innovation Research (SBIR) grant awarded by DOE, is a welding design software package called Virtual Fabrication Technology that enables SMEs to tap into HPC resources, so they can validate the integrity of welds in assembled components.”
  • “Roughly 4,100 genetic diseases affect humans, and they are the main causes of infant deaths. But identifying which genetic disease is affecting a critically ill child is extremely difficult. For one infant suffering from liver failure, the center used 25 hours of supercomputer time to analyze 120 billion nucleotide sequences and narrowed the cause of the illness down to two possible genetic variants. Thanks to this highly accurate diagnosis, the baby is alive and well today.”

One issue raised by Turek is the need to incentivize ISVs. The SME community highlighted in the report relies heavily on commercial codes, but these codes don’t scale well, and many ISVs have established positions of prominence; as a result, they have limited motivation to drive advances. Most of the codes were developed in the late 1960s and 1970s – often with government funding at national labs – when no one anticipated that scaling would become the dominant computing paradigm.

Turek made two suggestions. “One, we should reengage ISVs on this issue of modern algorithm development and modern numerical methods in the context of emerging kind of hardware architectures we’re seeing. That would go a long way to helping overcome the scaling issues of today.” Number two, thought needs to be given to how best to “provide incentives to get these commercial ISVs to modernize their software, make it portable, and make it scalable, if there is not a market push to do this.”

If the incentives are there and commercial ISVs “don’t respond well, then redirect that money to the open source community and help them get better established,” said Turek.
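
To make the modernization point more concrete, the sketch below contrasts a legacy-style serial kernel with the same numerics exposed to the compiler and runtime as parallel, vectorizable work via OpenMP. It is purely illustrative and not drawn from the report or the panel; the function names, problem size, and choice of OpenMP are hypothetical stand-ins for what “making codes portable and scalable” can mean in practice.

    /* Illustrative sketch only: the report and panel speak of "modernizing"
     * legacy ISV codes in general terms; this hypothetical example simply
     * contrasts a serial kernel with a threaded, vectorizable version. */
    #include <stdio.h>
    #include <stdlib.h>

    #define N 10000000  /* hypothetical problem size */

    /* Legacy pattern: single-threaded update, written for one fast core. */
    static void axpy_serial(double a, const double *x, double *y, size_t n) {
        for (size_t i = 0; i < n; i++)
            y[i] += a * x[i];
    }

    /* Modernized pattern: identical numerics, but the loop is exposed as
     * parallel, SIMD-friendly work via an OpenMP directive. */
    static void axpy_parallel(double a, const double *x, double *y, size_t n) {
    #pragma omp parallel for simd
        for (size_t i = 0; i < n; i++)
            y[i] += a * x[i];
    }

    int main(void) {
        double *x = malloc(N * sizeof *x);
        double *y = malloc(N * sizeof *y);
        if (!x || !y) return 1;
        for (size_t i = 0; i < N; i++) { x[i] = 1.0; y[i] = 2.0; }

        axpy_serial(3.0, x, y, N);    /* legacy path             */
        axpy_parallel(3.0, x, y, N);  /* scalable, portable path */

        printf("y[0] = %f\n", y[0]); /* 2.0 + 3.0 + 3.0 = 8.0    */
        free(x);
        free(y);
        return 0;
    }

Built with an OpenMP-enabled compiler (for example, gcc -fopenmp), the second kernel spreads the loop across cores; built without OpenMP, the pragma is simply ignored and the code still runs serially, which is part of what makes this style portable.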

Needed training and the potential for expanding the workforce are also covered. The report notes that HPC is an important component of the broader computer manufacturing subsector, which in the United States employs approximately 1 million individuals, 600,000 of them in production roles. In 2015, this employment included 28,370 computer hardware engineers; 22,570 semiconductor processors; 38,010 electrical and electronic engineering technicians; and 97,200 electrical and electronic equipment assemblers.

The report is peppered with statistics, quotes, and examples and is best read in full to form one’s own conclusions. It will be interesting to see whether the ITIF report can generate real activity. Sorensen, who participated in the early development of NSCI, noted, “I am really looking to see if this is a chance to get the Hill engaged.”

You can read the full report here and watch the video of the panel here.
