Since 1986 - Covering the Fastest Computers in the World and the People Who Run Them

February 21, 2014

Simulating HPC Workload Energy Costs

Carlo del Mundo

In today’s computing world, energy and power efficiency in data centers is critical to reducing a system’s overall total cost of ownership. Energy efficiency is so important that, over a system’s lifetime, the cost of powering a datacenter can far exceed its initial capital investment.

In addition to cost savings, improvements in energy efficiency also translate into lower carbon emissions. As Omar Al-Saadoon, System Specialist at EXPEC Computer Center, puts it, “one megawatt generates close to 8,000 metric tons of CO2 per year when burning petroleum to produce electricity.” In short, it behooves data center specialists to educate themselves on the energy efficiency (or inefficiency) of their systems, both to save on operating costs and to reduce environmental impact.
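To illustrate the conversion Al-Saadoon cites, here is a minimal sketch that turns a facility's average power draw into annual CO2 emissions, and derives the implied emission factor per kWh. The function names and the example 2.5 MW facility are illustrative assumptions; the only figure taken from the article is the 8,000 metric tons of CO2 per megawatt-year.

```python
# Sketch of the CO2 arithmetic implied by the article's figure:
# ~8,000 metric tons of CO2 per megawatt-year when burning petroleum.

HOURS_PER_YEAR = 8760
TONS_CO2_PER_MW_YEAR = 8000  # figure quoted by Al-Saadoon

def annual_co2_tons(avg_power_mw: float) -> float:
    """Annual CO2 emissions (metric tons) for a given average power draw."""
    return avg_power_mw * TONS_CO2_PER_MW_YEAR

def co2_kg_per_kwh() -> float:
    """Implied emission factor: kg of CO2 per kWh consumed."""
    kwh_per_mw_year = 1000 * HOURS_PER_YEAR  # 1 MW sustained for a year
    return TONS_CO2_PER_MW_YEAR * 1000 / kwh_per_mw_year

# A hypothetical facility drawing 2.5 MW on average:
print(annual_co2_tons(2.5))          # 20000.0 metric tons per year
print(round(co2_kg_per_kwh(), 3))    # ~0.913 kg CO2 per kWh
```

The emission factor works out to roughly 0.9 kg of CO2 per kWh, which makes it easy to tag individual jobs with a carbon figure once their energy consumption is known.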

To promote an energy- and green-conscious way of computing, Al-Saadoon and his team have developed a simulation framework that provides an intuitive view of the energy costs of a workload. His goal is to “empower simulation engineers to better assess the environmental effect of their simulation runs and become green-conscious.”

His team collected the energy and power characteristics of their hydrocarbon reservoir simulation on 1,024 compute nodes over the course of three months. Overall, the framework provides insight into the energy usage of datacenter-scale computers and incorporates environmental metrics such as carbon emissions on a per-job basis.

Energy usage in data centers is bifurcated into two groups: servers and supporting infrastructure. Servers are the physical computing systems that perform computation. The supporting infrastructure includes components such as cooling, lighting, UPS batteries, interconnects, and AC/DC conversion. Typically, the supporting infrastructure adds a significant component to the overall cost of running a data center. In fact, data center efficiency is measured across both the physical computing systems and the supporting infrastructure using a metric called Power Usage Effectiveness (PUE).

PUE is the ratio of total facility power to server power. A PUE value of 2 means that for every 1 kWh delivered to the servers, another 1 kWh is spent on cooling, lighting, and other infrastructure needs. The most efficient facilities have a PUE approaching 1, the ideal case in which little to no power is used for infrastructure.
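The PUE arithmetic above can be sketched as a few lines of code. This is not the authors' simulation framework; it is a hedged illustration of how a measured PUE scales server energy up to facility energy, with a hypothetical electricity price used to attach a dollar cost to a job.

```python
# Sketch: estimating total facility energy and per-job cost from
# server (IT) energy and a measured PUE.

def facility_energy_kwh(it_energy_kwh: float, pue: float) -> float:
    """Total facility energy implied by server energy and PUE."""
    return it_energy_kwh * pue

def job_energy_cost(it_energy_kwh: float, pue: float,
                    price_per_kwh: float) -> float:
    """Dollar cost of a job, including the infrastructure overhead."""
    return facility_energy_kwh(it_energy_kwh, pue) * price_per_kwh

# A job whose servers consume 500 kWh at PUE = 2.0 draws 1000 kWh
# of facility energy in total; price_per_kwh here is an assumption.
print(facility_energy_kwh(500, 2.0))        # 1000.0
print(job_energy_cost(500, 2.0, 0.10))      # 100.0 (at $0.10/kWh)
```

Combined with an emission factor, the same facility-level energy figure is what lets a framework like Al-Saadoon's report carbon emissions per job rather than per datacenter.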

 
