May 14, 2012
While Thomas Sterling’s interview about the impossibility of reaching zettaflops made a lot of sense, the history of making negative predictions about technology is often an embarrassing one. Here are three examples:
"I think there is a world market for maybe five computers."
Thomas Watson, chairman of IBM, 1943.
"There is no reason anyone would want a computer in their home."
Ken Olson, president, chairman and founder of Digital Equipment Corp., 1977.
“Next Christmas the iPod will be dead, finished, gone, kaput.”
Sir Alan Sugar, British entrepreneur, 2005.
If we wind back the clock to the days of megaflops, there were no commodity microprocessors (i.e., the killer micros that put paid to many proprietary architectures), and there were no multicore processors. Indeed, the Cray-1 was a single-processor machine. There was no OpenMP or MPI, and compute accelerators were the size of a fridge and cost tens of thousands of dollars.
Who would have thought that today’s HPC systems would use compute accelerators the size of a paperback book that are millions of times more powerful and cost a small fraction of the price? And I’ve lost count of how many times I’ve been told that the next generation of microprocessors will be the last major advance because the photolithography techniques used to manufacture chips have reached a limit beyond which decreasing the size of devices is impossible. The industry has achieved the impossible before, and will do so again.
Moore’s Law, which states that the number of transistors on an integrated circuit doubles every two years, is often taken to mean that performance will double every two years (some say 18 months). What started life as an observation has become the target that marketing men guarantee and engineering budgets are set against. And the straight-line graphs that technologists use to predict the future suggest that zettaflops systems will be built around the year 2030.
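That straight-line extrapolation is easy to reproduce. The sketch below assumes a roughly 16-petaflops top system in 2012 and a doubling period of about 13 months (close to the historical Top500 trend); both figures are illustrative assumptions, not numbers from the article.

```python
import math

def year_for_target(start_year, start_flops, target_flops, doubling_years):
    """Extrapolate a straight line on a log-scale performance chart:
    count the doublings needed, then multiply by the doubling period."""
    doublings = math.log2(target_flops / start_flops)
    return start_year + doublings * doubling_years

# Assumed starting point: ~16 petaflops peak in 2012 (illustrative).
# A zettaflops is 10**21 floating-point operations per second.
year = year_for_target(2012, 16e15, 1e21, 13 / 12)
print(round(year))  # ~2029 under these assumptions
```

With a 13-month doubling period the line crosses zettaflops near the end of the 2020s, consistent with the "around 2030" figure; stretch the doubling period to Moore's canonical two years and the crossing slips out to the 2040s, which is why the choice of trend line matters so much in these predictions.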
Professor Sterling pioneered the use of compute clusters and is a Gordon Bell Prize winner. He has excellent credentials in HPC, and I can’t refute a single fact that he put forward in his interview -- indeed, I am generally in full agreement with his insights on the issues the industry faces -- but I am certain that he is wrong in his conclusion.
Arthur C. Clarke, the science fiction writer, identified what he called the "three laws of prediction," reflecting an optimistic view of ingenuity:
1. When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong.
2. The only way of discovering the limits of the possible is to venture a little way past them into the impossible.
3. Any sufficiently advanced technology is indistinguishable from magic.
I have no idea what a zettaflops system will look like, but it will be magic.
About the author
John Barr covers IT early adoption and innovation in High Performance Computing at 451 Research. He is also responsible for the company's research activities within the European Commission Framework Program. John has over 30 years of experience in the IT industry, initially writing compilers and development tools for High Performance Computing platforms. The bulk of his career has been spent in a variety of technical roles at HPC systems vendors, delivering training, running benchmarks and providing pre- and post-sales customer support. John's core technical skill is application performance analysis, optimization and parallelization.