Power consumption per computational unit has dropped by 80% in the past six years, but at-the-plug consumption has still risen by a factor of 3.4, according to the Uptime Institute.

Antone Gonsalves, Contributor

August 2, 2007

While vendors paint a rosy picture of improving power-to-performance ratios in servers, the reality is that power usage in data centers is increasing, an IT education and consulting firm said Thursday.

In a white paper titled "The Invisible Crisis in the Data Center: The Economic Meltdown of Moore's Law," the Uptime Institute spells out why data centers are using more power at the plug, not less, even as the technology becomes more energy efficient.

Part of Uptime's argument is mathematical. Server compute performance has tripled every two years since 2000, while energy efficiency has only doubled over each of those two-year periods. As a result, from 2000 to 2006 server compute performance rose by a factor of 27, but energy efficiency rose by a factor of only eight.

Power consumption per computational unit has therefore dropped by 80% over those six years, yet at-the-plug consumption has still risen by a factor of 3.4. Processor manufacturers such as Intel, Advanced Micro Devices, and IBM add to the problem by packing ever more power-hungry chips into the same-size hardware, which generates more heat and demands more cooling.
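That compounding is easy to check. The short sketch below is a back-of-the-envelope calculation, not material from the white paper itself: it compounds the stated two-year rates over the three two-year periods from 2000 to 2006, reproducing the 27x, 8x, and 3.4x figures, and yields a per-unit power drop in the same neighborhood as the roughly 80% Uptime cites.

```python
# Back-of-the-envelope check of Uptime's compounding arithmetic
# (assumed rates: performance 3x and efficiency 2x per two-year period).

PERIODS = 3  # three two-year periods between 2000 and 2006

performance_growth = 3 ** PERIODS   # 27x more compute per server
efficiency_growth = 2 ** PERIODS    # 8x more compute per watt

# Power drawn at the plug grows with compute delivered, shrinks with efficiency.
at_the_plug_growth = performance_growth / efficiency_growth   # about 3.4x

# Power needed per unit of computation falls as efficiency rises.
per_unit_power_drop = 1 - 1 / efficiency_growth               # roughly the 80% cited

print(f"Performance growth:  {performance_growth}x")
print(f"Efficiency growth:   {efficiency_growth}x")
print(f"At-the-plug growth:  {at_the_plug_growth:.1f}x")
print(f"Per-unit power drop: {per_unit_power_drop:.0%}")
```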

Virtualization, which lets IT staff consolidate more server software onto a single box, can cut power use and free up data center capacity. The technology, however, only delays the inevitable. "After virtualization has taken some of the slack out of underemployed IT hardware, the trend in power growth will resume," Uptime said.

When buying new data center servers, Uptime advises organizations to weigh the total cost of ownership (TCO) carefully. Current trends in power consumption indicate that by 2009 the cost of powering a server over three years will exceed the cost of the hardware itself.

Put another way, by 2012, $1 million spent on servers will add $6.54 million more to the total cost of ownership of data center infrastructure than the same $1 million server purchase did in 2000, Uptime said, owing to higher power consumption, cooling costs, and other factors.

As a simple but effective way to project future site TCO, Uptime suggests developing and maintaining a chart of hardware power consumption at the plug divided by the actual price paid for the hardware in thousands of dollars. The firm also recommends that organizations include site TCO when figuring the cost per compute unit.
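As an illustration of that chart, here is a minimal sketch of the metric as described, watts measured at the plug divided by the purchase price in thousands of dollars; the server labels, wattages, and prices below are hypothetical, not figures from Uptime.

```python
# Hypothetical tracking chart for Uptime's suggested metric:
# plug-load watts divided by hardware price in thousands of dollars.
# All entries are illustrative placeholders.

servers = [
    # (label, watts at the plug, purchase price in US dollars)
    ("1U web server, 2004 purchase", 350, 4_000),
    ("2U database server, 2007 purchase", 650, 6_500),
    ("Blade chassis, 2007 purchase", 4_500, 45_000),
]

print(f"{'Hardware':<36}{'Watts/$K':>10}")
for label, watts, price_usd in servers:
    watts_per_thousand_dollars = watts / (price_usd / 1_000)
    print(f"{label:<36}{watts_per_thousand_dollars:>10.1f}")
```

Tracked across successive purchases, a rising ratio signals that power and cooling will claim a growing share of site TCO, which is the trend the white paper warns about.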

In the long term, power consumption won't go down in real terms until the rate of improvement in energy efficiency equals or exceeds the rate of increase in computational performance. Hardware that meets that bar could take as long as 10 years to reach the market.

In the meantime, organizations can reduce energy consumption by as much as 50%, Uptime said, by using server virtualization, enabling server power-save features, turning off servers no longer in use, pruning bloated software, and improving site infrastructure energy efficiency ratios.
