Aaron Dubrow of the Texas Advanced Computing Center has written a brief history of NSF supercomputing efforts appearing in the Huffington Post this week. 2016, of course, is the thirtieth anniversary of the original NSF-backed supercomputing centers, and to mark the milestone Dubrow calls out thirty scientific advances that have been achieved through use of NSF-supported supercomputers (the most recent eleven are bulleted below).
The five original NSF-funded supercomputing centers, very familiar to most in the HPC community, include:
- The National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign;
- The Pittsburgh Supercomputing Center (PSC) at Carnegie Mellon University and the University of Pittsburgh;
- The San Diego Supercomputer Center (SDSC) at the University of California, San Diego;
- The Cornell Theory Center at Cornell University (later to become the Cornell University Center for Advanced Computing);
- The John von Neumann Center at Princeton University (which was discontinued after five years).
Here’s an excerpt from Dubrow’s article posted this week: “These centers, which celebrated their 30th anniversaries this year, have served as cornerstones of the nation’s high-performance computing and communications strategy. They helped push the limits of advanced computing hardware and software, even as they provided supercomputer access to a broad cross-section of academic researchers, enabling the study of everything from subatomic particles to the structure of the early universe.
“In the intervening years, NSF has supported new centers and university programs — including the Texas Advanced Computing Center (TACC) at the University of Texas at Austin and the National Institute for Computational Sciences (NICS) at the University of Tennessee, Knoxville — as well as major programs at Indiana University, Purdue, Rice University and many other leading research institutions.”
Among the advances noted by Dubrow are these eleven:
- 2006: Team led by University of Illinois researcher Klaus Schulten simulates an entire life form for the first time. (NCSA)
- 2007: Astrophysicist Volker Bromm and his team model the first billion years of the universe, shedding light on the cosmic past and future. (TACC)
- 2008: During Hurricane Ike, researchers use the Ranger supercomputer to develop storm surge forecasts and safeguard coastal communities. (TACC)
- 2009: Researchers use advanced computing to show how individual Social Security numbers can be guessed from public information on the Web. (PSC)
- 2010: Researchers aid oil spill containment efforts after the Deepwater Horizon explosion using satellite and supercomputing technologies. (TACC, LONI)
- 2011: Extreme Science and Engineering Discovery Environment (XSEDE) awarded $121 million by NSF to bring advanced cyberinfrastructure, digital services and expertise to the nation’s scientists and engineers. (NSF)
- 2012: University of Illinois researchers use PSC systems to show how large-scale traders used small stock purchases to game the system; discovery leads to rule changes in the NYSE and NASDAQ exchanges. (PSC)
- 2013: 3D image data enables University of South Carolina researchers to create patient-specific tissue structures. (NICS)
- 2014: A widely published global genome study using XSEDE resources and expertise shows how avian lineages diverged after the extinction of dinosaurs. (SDSC, NICS, TACC)
- 2015: Wake Forest researchers publish virtual crash test study, helping auto manufacturers design safer vehicles and restraint systems. (PSC)
- 2016: XSEDE resources from TACC and SDSC help confirm discovery of gravitational waves by Laser Interferometer Gravitational-Wave Observatory (LIGO) detectors.
Link to his Huffington Post article: http://www.huffingtonpost.com/entry/three-decades-of-making-impossible-research-possible_us_58501c95e4b0016e50430775