Data Center Power Consumption on the Rise, Report Shows

The amount of electricity used to power the world’s data center servers doubled in a five-year span due mainly to an increase in demand for Internet services, such as music and video downloads, and telephony, according to a new report.

If current trends continue, the amount of power needed to run the world’s data center servers could increase by an additional 40 percent by 2010, said Jonathan Koomey, a staff scientist at Lawrence Berkeley National Laboratory in Berkeley, Calif., and a consulting professor at Stanford University.

Koomey’s report, funded by Advanced Micro Devices, the Sunnyvale, Calif., chip maker, is being presented at the LinuxWorld OpenSolutions Summit in New York City on Feb. 15.

Between 2000 and 2005, according to Koomey’s research, the amount of power used by servers within the data center doubled. In the United States, that represented 14 percent annual growth in electricity use, while worldwide use increased by about 16 percent every year.
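Those two figures are consistent with each other: compounding the annual rates over the five-year span reproduces roughly the doubling the report describes. A minimal sketch of that arithmetic (the 14 and 16 percent rates come from the article; the compounding check itself is simple math, not data from the report):

```python
def compound_growth(rate, years):
    """Total growth factor after applying an annual rate for `years` years."""
    return (1 + rate) ** years

# 14% annual growth in the U.S., 16% worldwide, over 2000-2005
us_factor = compound_growth(0.14, 5)     # ~1.93, close to a doubling
world_factor = compound_growth(0.16, 5)  # ~2.10, slightly more than double
print(f"U.S.: x{us_factor:.2f}, worldwide: x{world_factor:.2f}")
```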

In 2005, powering data center servers cost U.S. companies a total of $2.7 billion; the worldwide electricity bill topped $7 billion. Within the United States, powering those servers accounted for about 0.6 percent of the country’s total electricity use. When the additional costs of cooling and other infrastructure are factored in, that figure jumps to 1.2 percent.
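The jump from 0.6 to 1.2 percent implies that cooling and other supporting infrastructure roughly matches the direct server load. A quick sketch of that inference (the 2x multiplier is derived here from the article’s two percentages, not quoted from the report):

```python
server_share = 0.006  # servers alone, as a share of total U.S. electricity use
total_share = 0.012   # servers plus cooling and other infrastructure

# Ratio of total data center load to server load alone
overhead_multiplier = total_share / server_share
print(f"Implied infrastructure multiplier: {overhead_multiplier:.1f}x")
```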

“The total power demand in 2005 (including associated infrastructure) is equivalent (in capacity terms) to about five, 1000 MW [megawatt] power plants for the U.S. and 14 such plants for the world,” Koomey wrote in the report.


The study, using data from IDC, looked specifically at servers used in the world’s data centers, which represent about 60 to 80 percent of a data center’s total IT loads.

As demand for new technology grew, the number of installed low-end volume servers (typically systems priced under $25,000, a category that also includes blades) increased. This growth in server count, rather than rising energy use per server, appears to have driven the skyrocketing energy consumption of the past five years.

“Almost all of this growth is attributable to growth in the numbers of servers (particularly volume servers), with only a small percentage associated with increases in the power use per unit,” according to the report.

In the report, Koomey acknowledges that further study of other data center equipment, such as storage and networking gear, is needed to gain a more complete view of the average cost of powering a data center.

Koomey concludes that a number of factors could change power consumption in the next several years, including the adoption of more blades in the data center, virtualization technology, and more awareness of the total cost of ownership of data center equipment.


“The total cost of building a large data center is now on the order of $100 to $200 [million], which is sufficient to get the attention of the CEO of most large organizations,” Koomey writes.

“That visibility to corporate management is likely to drive operational and design improvements that should over time improve the Site Infrastructure Energy Efficiency Ratio and spur the adoption of energy metrics and purchasing standards for efficiency of IT equipment within these companies.”
