Since 1986 - Covering the Fastest Computers in the World and the People Who Run Them

September 27, 2010

GPU Technology on Center Stage at GTC ’10

Nicole Hemsoth

You know you’re in for a ride when graphics giant NVIDIA kicks off a keynote by handing out stereoscopic 3D glasses. Although this time around there was no protective eye gear or ominous sheeting to shield guests from stray paint bullets, there was enough hard science matched with stunning graphical displays to compete with any army of giant robots spitting out Renaissance art replicas.

At this year’s GPU Technology Conference in San Jose, which came to a close at the end of last week, technical, scientific, and parallel computing were among the top keywords. Instead of an emphasis on gaming and rendering, the focus, judging by the range of sessions, was far more rooted in the practical application of GPUs in production scientific and medical research. This marks a significant trend that has been steadily emerging over the last three years, and one that appears to be strengthening.

From the keynote hosted by NVIDIA co-founder and vibrant CEO Jen-Hsun Huang to a series of demonstrations that ranged from GPU-powered surgery on a beating heart to real-time remote rendering, the conference kickoff demonstrated just how transformational GPU technology is, and is becoming. Sessions included detailed descriptions of the use of (sometimes hosted, thus “cloud-like”) GPU computing in oil and gas, finance, and biomedical research, in addition to the more expected areas where graphics power is required. The session list is worth a few moments if you’re skeptical about the increasing number of research arenas being powered (or soon to be powered) by the GPGPU shift.

If you weren’t able to make it to the conference, below is the beginning of Jen-Hsun Huang’s keynote, in which he discusses the cloud and some of the emerging trends for GPUs as they find their way into new niches and root themselves deeper in those they’ve already inhabited for years, namely gaming and other graphics-reliant verticals.

One of the show-stoppers during the keynote was a demonstration of cloud-based iRay and Autodesk’s 3D modeling software, 3DS Max.

The 3DS Max software, an integrated 3D modeling, rendering, and animation tool, is the technological backbone for any number of graphic designers, engineers, and researchers across a wide range of arenas. Before cloud servers were an option, designers were reliant on CPUs, which meant a much longer wait. In fact, the presenters noted that the image they used for their demonstration took only a few minutes via cloud rendering, versus the three or more hours the same task would have taken otherwise.

The other benefit for end users, of course, is that they can handle their rendering tasks from just about anywhere. The case they cited is that of an architect rushing off to meet a client, making last-minute changes in the car and, better still, being able to show his or her customer implementations of suggested changes within moments.

As Autodesk’s Pimentel noted, “Simulating reality is a very computationally intensive problem and the only solution we had in the past was to figure out ways to simplify the process to maximize the simulation realism. Here, we are physically simulating every photon as it passes through the scene.”

The company’s striking capabilities are delivered via 32 Fermi processors spread across a remote cluster, which allows rendering simulations to take place in real time rather than over the hours a standard CPU cluster would need. Pimentel impressed the audience by showing how a rendering could be updated on the spot with remarkable speed.

“The breakthrough is that we’re able to simulate light by tracing every photon in the scene, and in order to do that you have to model the interaction of light coming through materials like glass, light bouncing off things, semi-gloss materials, and to do this requires a massive amount of floating-point computation.” This has, of course, been one of the barriers for this software in the past because, as Kaplan stated, “there was not enough floating point processing power in CPU-driven supercomputers.”
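As a rough illustration of why this kind of light transport soaks up so much floating-point arithmetic, the toy Python sketch below traces rays against a single sphere and Monte Carlo samples random bounce directions at each hit. The scene, function names, and one-bounce sampling scheme are hypothetical simplifications for illustration only; iRay’s actual algorithm is far more sophisticated.

```python
import math
import random

def sphere_hit(origin, direction, center, radius):
    # Nearest positive intersection distance t with the sphere, or None.
    # Assumes `direction` is unit length.
    oc = [o - c for o, c in zip(origin, center)]
    b = sum(d * x for d, x in zip(direction, oc))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - c
    if disc < 0:
        return None
    t = -b - math.sqrt(disc)
    return t if t > 1e-6 else None

def random_hemisphere(normal):
    # Uniform random unit vector, flipped into the hemisphere around `normal`.
    while True:
        v = [random.gauss(0.0, 1.0) for _ in range(3)]
        n = math.sqrt(sum(x * x for x in v))
        if n > 1e-9:
            break
    v = [x / n for x in v]
    if sum(a * b for a, b in zip(v, normal)) < 0:
        v = [-x for x in v]
    return v

def trace(origin, direction, sphere, samples=64):
    # Brightness estimate: 1.0 for rays that escape to a uniform white sky;
    # at a surface hit, a Monte Carlo average over random bounce directions.
    center, radius = sphere
    t = sphere_hit(origin, direction, center, radius)
    if t is None:
        return 1.0
    hit = [o + t * d for o, d in zip(origin, direction)]
    normal = [(h - c) / radius for h, c in zip(hit, center)]
    # Every sample is a handful of float multiplies, adds, and square roots;
    # a full renderer repeats this per pixel, per bounce, per frame.
    lit = 0
    for _ in range(samples):
        if sphere_hit(hit, random_hemisphere(normal), center, radius) is None:
            lit += 1
    return lit / samples
```

For a unit sphere centered three units down the z-axis, a camera ray fired straight ahead intersects it at distance 2; multiply that per-ray cost by millions of pixels, many bounces, and many samples each, and the appetite for floating-point throughput that Kaplan describes becomes clear.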

It’s difficult to describe seeing the realistic representation of light reflecting off a glass “surface” in a rendering that changes as the lighting direction does, reflecting new aspects of the surrounding scene. Applying this in gaming and film will add an entirely new level of photorealistic perfection to a rendered image, something that was previously far less automated and required considerably more manual effort.

For those who might only be familiar with NVIDIA because of their heavy presence in the world of gaming and rendering, it might come as a surprise that their scientific and technical computing initiatives are taking hold in a way that is nothing short of striking, as evidenced by the following short walk down a hallway lined with research powered by GPUs.

If there is any company capable of providing stunning eye candy, it’s certainly the Santa Clara GPU maker. NVIDIA has been straddling the line between its roots in the mainstream world of PC and console gaming and, increasingly, high-performance computing for science and business. With GPUs quickly gaining momentum, and in some cases displacing traditional CPU-based systems as the preferred mode of computing, it’s no wonder that attendance was off the charts this year, filled with equal parts “cool factor” from the graphics end of the business and hard science, as evidenced by the long list of detailed sessions dedicated to exploring everything from specific topics in parallel computing to the evolution of the mighty GPU.

Journalistic impartiality aside, it’s hard to deny that NVIDIA has a tendency to present the mind-boggling on a consistent basis. The company is rather hard to ignore on the graphics front, but even more so than this time last year, GPU computing is making major inroads with HPC users, with more production examples emerging regularly and some great multidisciplinary research on the horizon that actually uses GPUs rather than simply testing them. It’s only been a few years, but for the believers (and they were there in droves) this marks a disruptive shift for technical computing.