For the next generation of high-end supercomputers, speed may no longer be the most important attribute. For machines such as the upcoming Blue Waters system, which will be installed at the University of Illinois, success will involve more nuanced designations than “the world’s fastest computer.” An article at the Chronicle of Higher Education, penned by Jeffrey R. Young, asserts that “flat-out speed, for a long time the measure of a supercomputer’s worth, may be going out of style.” In its stead we’ll likely see a growing emphasis on sophisticated software and innovative architecture.
Young explains that speed is an enticing goal because it attracts federal dollars. There’s about $1.6 billion of government money available for high-end computing projects, and the debate about what makes a supercomputer “super” affects who gets the money. There’s a growing consensus in the HPC community cautioning against using “speed” as the main arbiter of value. With multicore transitioning into manycore, the main challenge will be developing code that can straddle all those cores, but software design is not where the glory is. As Thom Dunning, Director of the National Center for Supercomputing Applications, notes: “Every congressman loves to sign his name to the latest, greatest machine. That’s the photo op. You don’t get the same photo ops with software.”
Support is growing for this “work smarter, not faster” approach. A December report from the President’s Council of Advisors on Science and Technology “calls for a more balanced portfolio of U.S. supercomputing development, and warns against an overemphasis on speed rankings like the Top 500 list.” This distinguished panel of experts concludes that an arms race based solely on speed diverts attention and resources from valuable scientific endeavors.
Young drives home the point with this automotive analogy: “Think, say advocates, of the folly of a best-car list based only on top speed. So what if a Ferrari is faster than a Volvo station wagon when you have to take two kids to soccer practice?”
While the TOP500 list still reigns as the world’s bellwether supercomputing standard, it’s no longer the only game in town. Over the past few years, other metrics have sprung up, including the Green500, the HPC Challenge benchmark, and the Graph500. And despite the ubiquitous use of the Linpack benchmark, there’s growing awareness of its limitations. Jack Dongarra, director of the Innovative Computing Laboratory at the University of Tennessee at Knoxville and one of the list’s creators, questions its usefulness, since it reveals only one measure of performance: how quickly a computer can solve a dense system of linear equations. “These computers are complicated, and they have many facets, and we should evaluate the different components that go into the systems,” Dongarra adds.
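To see how narrow that single measure is, here is a minimal Linpack-style sketch in Python (assuming NumPy is available; the function name is illustrative, not part of any benchmark suite). It times the solution of a dense system Ax = b and converts that to GFLOP/s using the conventional HPL operation count of 2/3·n³ + 2n² floating-point operations. Everything else that matters on a real machine — memory bandwidth, interconnect latency, irregular data access — is invisible to this number.

```python
import time
import numpy as np

def linpack_style_gflops(n=1000, seed=0):
    """Time the solution of a dense n x n system Ax = b and report GFLOP/s.

    Uses the conventional HPL flop count 2/3*n^3 + 2*n^2. The result
    reflects dense linear-algebra throughput only -- nothing about
    memory bandwidth, interconnect, or irregular workloads.
    """
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((n, n))
    b = rng.standard_normal(n)

    t0 = time.perf_counter()
    x = np.linalg.solve(A, b)   # LU factorization + triangular solves
    elapsed = time.perf_counter() - t0

    flops = (2.0 / 3.0) * n**3 + 2.0 * n**2
    # Relative residual as a sanity check that the solve succeeded
    residual = np.linalg.norm(A @ x - b) / np.linalg.norm(b)
    return flops / elapsed / 1e9, residual
```

A Graph500-style workload, by contrast, would stress pointer-chasing through irregular graph structures, where this kind of peak arithmetic rate tells you almost nothing — which is precisely the critics’ point.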