April 16, 2011

HP Rep Dishes on the Exascale Datacenter

Tiffany Trader

On the NextBigFuture blog, Sander Olson interviews Partha Ranganathan, Hewlett Packard’s principal investigator for HP’s exascale datacenter project, on a host of issues relevant to the HPC industry. Ranganathan’s frontline commentary covers a wide range of topics, from datacenter design to cloud computing to the challenges of exascale computing. The same design challenges that affect top-level supercomputers also affect modern datacenters. According to the HP rep, creating a feasible path for exascale computing in supercomputers and datacenters will require serious efficiency improvements.

Part of the big push to increase computing power, including expanding into the cloud, comes from the need to examine and dissect an ever-increasing data flow. This trend has led Ranganathan to assert that the “industry is transitioning from the information era to the insights era.” Ranganathan explains that the ability to derive “useful information from the deluge of digital data…will provide valuable insights into many things.” He sees this as the “killer app” for cloud computing.

On the subject of exascale computing, Ranganathan describes the technical challenges as daunting, and explains that HP created the exascale datacenter project to help find solutions. Specifically, Ranganathan sees the power issue as the biggest showstopper. He cites the statistic that the carbon footprint of the world's datacenters is comparable to that of the aviation industry. To address the "power wall," designers will need to create systems that are 10-100 times more power efficient. Servers, which reside at the center of any datacenter, will need to become more efficient memory-wise. On this point, Ranganathan believes memristors could hold the key:

Memristors are two-terminal components developed by HP Labs, the company’s central research arm, that could serve either as memory or as logic. Memristors could be used to combine the logic and the memory in a single area, instead of having memory in a separate section. By combining logic and memory, we could essentially eliminate the memory bottleneck which plagues current computer architectures.

Magnetic hard drives won't disappear. They will simply go to the next level of storage, which is archival. The cost-per-bit for magnetic hard drives will probably always be lower than for a comparable solid-state drive. But by eliminating spinning disks from most mainstream usage, we can simultaneously reduce power consumption and increase overall performance.

Looking forward into the next decade, Ranganathan envisions a paradigm shift in the way computers and datacenters operate. In this future scenario, data and computing will be aligned by "intelligent disks," and "the data will be able to examine itself and derive insights." He explains that this new model will be more efficient than current memory techniques and will change the way that datacenters relate to data. It's a more organic approach, "similar to the way the human brain works," Ranganathan adds.

Full story at Next Big Future
