Accelerated computing certainly dominated the IBM message at SC15, but there were many sub-themes in Austin. Big data, the beneficial impact of software frameworks (think Apache Spark), workflow optimization, and a growing role for cloud in HPC delivery were all in the mix.
HPCwire managing editor John Russell sat down with Dave Turek, IBM VP of High Performance Computing, for a fast tour of CORAL progress, NSCI hopes, the distinction between accelerator-assisted and accelerated computing, and other topics now under the IBM magnifying glass.
On the rise of big data and its inexorable push into HPC, Turek noted that the more data you have, the richer the insight you can gain from analysis. “Our strategy is centered on that and we are making material changes to our underlying architecture to accommodate these kinds of data flows. Absent [doing] that, simulation and modeling stands as an island in isolation and the value is quite diminished.”
The essence of HPC is evolving, said Turek, and the new imperative is to accommodate big data. New programming models, frameworks a la Spark, and better-designed systems from IBM and its systems maker brethren are all necessary.