The Wild, Wild Cost Of Data Centers

By Eric Chabrow

Kenneth Brill sees what many others don’t: an ominous financial crisis in the making for business, stemming from energy waste in the corporate data center. Like Al Gore, who travels the world warning of an environmental catastrophe caused by global warming, Brill evangelizes on a more targeted aspect of the environmental crisis, warning IT and business managers about the fiscal and operational consequences of IT’s inefficient use of energy.

As the founder and executive director of the Uptime Institute, a member-supported data-center advisory service, Brill warns company executives that they must change the way they finance and manage data centers and their components (servers, blades, etc.) in order to save millions of dollars, and perhaps tens or hundreds of millions.

This crisis is being fueled by the increasing divergence between rapid performance gains in server computing and far slower advances in energy efficiency. Brill characterizes this as a meltdown of Moore’s Law, while others see it as Moore’s Law run amok.

At the root of the problem is simple math. Server compute performance has been tripling roughly every two years, but energy efficiency has only been doubling over the same period. Divide one by the other and the power needed to deliver that performance grows by half every two years, so the cost to power the data center climbs significantly year after year. And that will cost business big time.
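A back-of-the-envelope sketch makes the gap concrete. Using the ratios above (performance tripling, efficiency doubling), and assuming organizations keep buying the performance Moore’s Law delivers, the power draw compounds like this:

```python
# Sketch of the performance/efficiency gap described above. Assumes, per the
# article, that delivered performance triples every two years while energy
# efficiency (performance per watt) only doubles; the shortfall shows up as
# a growing power draw.

PERF_GROWTH_PER_2YR = 3.0  # compute performance: 3x every two years
EFF_GROWTH_PER_2YR = 2.0   # performance per watt: 2x every two years

power_growth_per_2yr = PERF_GROWTH_PER_2YR / EFF_GROWTH_PER_2YR  # 1.5x
annual_growth = power_growth_per_2yr ** 0.5                      # ~22%/year

print(f"Power draw grows {power_growth_per_2yr:.1f}x every two years")
print(f"That is roughly {(annual_growth - 1) * 100:.0f}% a year")
print(f"Over a decade: {power_growth_per_2yr ** 5:.1f}x the power")  # ~7.6x
```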

Brill recently spoke with CIO Insight Editor Eric R. Chabrow. What follows is an edited transcript of that conversation.

CIO Insight: You’ve been warning about a coming crisis in the data center. How bad is it?

Kenneth Brill: It’s actually the IT story of the year: The success of Moore’s Law has brought us to the point where we’re hitting system limitations. Computing performance has been increasing two to three times every 24 months, and the hardware has become so heavy that buildings have a difficult time supporting it.

Power consumption is so significant that accommodating the increase in IT power consumption over the next five years will require us to construct 10 major power plants. If current trends continue, we will need another 20 between 2010 and 2015. That’s 30 power plants that need to be built just to accommodate the growth in IT power consumption.

Why is this happening?

Brill: When you buy IT hardware, a certain number of watts are going to be consumed when you plug it in. The ratio of embedded watts per $1,000 of hardware has risen dramatically, so even if you spend the same amount of money on IT hardware, that new hardware’s increased performance brings a vast increase in power consumption. This happens invisibly for most organizations, because the power bill doesn’t go to IT.
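To make the embedded-watts point concrete, here is a small illustration. The wattage figures below are assumptions chosen for the example, not Uptime Institute data; the pattern, a flat budget carrying a rising power load, is the point:

```python
# Illustrative only: the watts-per-$1,000 figures are assumed for this sketch,
# not measured values. The point is that a flat hardware budget can still
# carry a sharply rising embedded power load.

budget_dollars = 1_000_000        # same hardware spend in both refresh cycles

watts_per_1k_old = 30             # hypothetical: last generation's servers
watts_per_1k_new = 90             # hypothetical: denser, faster replacements

old_load_kw = (budget_dollars / 1_000) * watts_per_1k_old / 1_000  # 30 kW
new_load_kw = (budget_dollars / 1_000) * watts_per_1k_new / 1_000  # 90 kW

print(f"Embedded IT load: {old_load_kw:.0f} kW -> {new_load_kw:.0f} kW")
# Because the power bill typically goes to facilities rather than IT,
# this tripling never appears in the IT budget.
```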

Are you suggesting it’s coming out of nowhere?

Brill: People at the working level, such as the members of our Uptime Institute, have seen a 6 percent to 7 percent compound annual growth rate in power consumption in their data centers. This is across five million square feet of computer room space, so it’s a pretty big sample. Although 6 percent to 7 percent doesn’t sound like a lot, compounded over five years it increases your power consumption by roughly 40 percent.

Something very dramatic happened at the end of 2005. We don’t know what caused it, but it’s abnormal. Power consumption had been growing at a 6 percent to 7 percent compound annual rate, but for the top third of our members, the growth rate jumped to 25 percent annualized. That means the power consumption of the top third of our members went up more than 50 percent in just two years. There’s no historical precedent for that.
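The compounding behind both figures is easy to check; a quick sketch of the arithmetic:

```python
# Quick check on the compounding in the growth rates cited above.

# Steady-state growth: 6-7% a year, compounded over five years.
for rate in (0.06, 0.07):
    growth = (1 + rate) ** 5
    print(f"{rate:.0%}/yr for 5 years -> {(growth - 1) * 100:.0f}% more power")
# Prints: 6%/yr -> 34% more power, 7%/yr -> 40% more power

# Post-2005 spike: 25% annualized, compounded over two years.
spike = (1 + 0.25) ** 2
print(f"25%/yr for 2 years -> {(spike - 1) * 100:.0f}% more power")  # 56%
```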