The Evolving Data Center
By Tony Kontzer | Posted 07-10-2007
To paraphrase Mark Twain's beaten-to-death yet still relevant quotation, rumors of the data center's death have been greatly exaggerated. For years, technology pundits have predicted that the Internet—which in many ways can be viewed as a giant virtual data center—would eventually eliminate the need for companies to keep rooms full of big, fast, powerful computers that serve up data to networks of users. Lo and behold, those predictions appear to have been misguided.
In fairness, big computers aren't as prevalent as in the past—technology's adherence to Moore's Law has seen to that. But fast and powerful computers? They're still very much needed at the nerve center of any large corporation. Granted, there's a handful of companies whose data centers are growing to massive proportions—Google and Microsoft are the most obvious examples, with both tech giants serving up huge amounts of personalized data to legions of online consumers.
Michael Bell, vice president of server research for Gartner, says he can't even venture a guess as to how big Google's data center has grown, but he suspects it's responsible for a large portion of the company's $1 billion in annual electricity spending. [The New York Times last year estimated Google had at least 450,000 servers running in 25 data centers.]
But at most companies, data centers are shrinking as servers get ever smaller, costs related to powering and cooling them are rising, and tools such as virtualization software enable single servers to act like clusters. Yet as they get smaller, corporate data centers are becoming more important to their respective businesses. There are a plethora of reasons for this, from the continued growth of electronic business to the growing need to support mobile computing to the increasing complexity of the applications data centers support. Throw in an additional consideration Bell believes is critical—namely, the transformation of applications from locally run clients into on-demand services delivered to dumb devices—and the data center's place in the IT lineage appears safe for some time to come.
There also is a socio-environmental consideration: In an era when the fight against global warming has become a mainstream cause, some CIOs believe it's their duty to cut back on the power they use to run and cool all of their data center machines. But there is far from across-the-board agreement, with many CIOs saying they're way too busy supporting their company's core businesses to be bothered with environmental do-goodism.
That debate aside (we'll get back to it later), an indisputable truth has arisen: It's time to rethink the data center. Sun Microsystems CEO Jonathan Schwartz suggested as much in a well-worded blog entry last October, concluding with this ominous thought: "Why bother with data centers at all?" In response, companies across the U.S. and around the world are taking steps, practical and creative, to transform their data centers into operations befitting their changing business environments, and in some cases, the world's changing relationship with its resources.
Doing More With Less
Over the next two years, global advertising firm Ogilvy Worldwide plans to shrink its main New York data center from the 5,000 square feet it now occupies to just 500 square feet, CIO Atefeh Riazi says. In the process, it will cut its server inventory from 1,500 to 750, ending up with fuller racks. It will look to deliver more business applications as on-demand services, shifting the need for computing power from the desktop to the data center. And it will make more use of virtualization, helping to reduce the number of e-mail servers it relies on from 80 to three.
"Our data center is going to be smaller in terms of space, but we'll have much more capacity than we did before," says Riazi. "It's an easier environment to manage. It's almost a utility. We know we need to start focusing on applications and services, and not so much on servers and storage and the stuff we've been managing for a long time."
Welcome to the new reality confronting these once oversized rooms where workers watched over large machines whose main job was to store and serve up data to employees.
Accounting and consulting giant Deloitte & Touche is on a similar path. The company has shut down 100 redundant applications in the past 18 months, including consolidating more than a dozen customer-management systems and eight performance-management systems into one of each, U.S. CIO Larry Quinlan says. It has slashed the number of voicemail systems scattered throughout its network from 80 to two, and has centralized its application servers, shifting the core of its e-mail and fax applications to its main data center in Nashville, Tenn. It has also moved 300 file servers from its field offices to the data center. The cumulative impact has eased the need for the server rooms that have functioned as local data centers for many of the company's 85 domestic offices.
To help it manage those changes, Deloitte & Touche has, like so many others, made blade server technology and server-virtualization software major components of its IT strategy, redefining the role the data center plays from here on.
"It's an evolution, and we are moving toward fewer data centers and fewer places where there are important processing capabilities," Quinlan says. "We really are moving the data center to a totally indispensable kind of environment, something you couldn't possibly do without."
As a result, IT executives like Riazi and Quinlan have had to significantly alter the way they view and manage their data center assets, and they should expect more changes as technology evolves.
"They need education to solve the problems that they're facing, and that they're going to be facing," says Tom Roberts, director of data center services for Trinity Health, which operates 17 hospitals in seven Midwestern states, and a board member of the AFCOM [Association for Computer Operations Management] Data Center Institute, a think tank devoted to data center-related issues. "Over the past two or three years, the technology has changed so fast, things have gotten so fast and so hot, and so power-hungry, that no one has had a chance to react."
Consider that in the past, every watt of power devoted to computer processing in data centers required a half-watt of power for cooling and lighting, and today that equation has flipped: Every watt consumed by computing resources now requires two watts of power for cooling and lighting, according to Gartner's Bell. "That means one-third of power is going to useful work, and two-thirds is devoted to non-productive tasks," he says.
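Bell's arithmetic maps onto what the industry calls power usage effectiveness, or PUE: total facility power divided by the power actually reaching the IT gear. A minimal sketch of the numbers he cites (the function and variable names here are illustrative, not Bell's):

```python
def pue(it_watts, overhead_watts):
    """Power usage effectiveness: total facility power / IT power."""
    return (it_watts + overhead_watts) / it_watts

# Old rule of thumb: 0.5 W of cooling and lighting per watt of compute.
old_ratio = pue(1.0, 0.5)   # 1.5 -- two-thirds of power does useful work

# Bell's current figure: 2 W of overhead per watt of compute.
new_ratio = pue(1.0, 2.0)   # 3.0 -- only one-third does useful work

print(1.0 / old_ratio, 1.0 / new_ratio)
```

The flip Bell describes is thus a doubling of the facility's total power bill for the same useful work, which is why shrinking the data center footprint has become as much a cost question as a space question.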
That fast-changing equation is forcing IT executives to think more conscientiously about their data centers on multiple fronts. They must keep up with technology to ensure they have the computing resources to support today's complex business applications; they must squeeze more productivity out of their servers so as not to let those machines sit idly, as they often did in the past; and they must control the increased costs of powering and cooling those resources, which often requires them to reduce the size of their data centers to cut down on waste.
Often, however, solving one problem causes another. A few years ago, brokerage firm Thomas Weisel Partners reached the point where it was running out of data center space. It tackled the problem with a combination of blade servers, which pack more computing power into smaller boxes, and virtualization software, which lets individual servers handle the workload of multiple machines. Virtualization had the added benefit of letting the company run multiple applications on a single server without worrying about intermingling customer data, a huge concern for financial services companies.
But the combination of smaller, hotter-running machines and virtualization led to heavier power consumption and cooling requirements. "I'm probably more likely to run out of power and cooling before I run out of space," says Kevin Fiore, Thomas Weisel's vice president and director of engineering services.
The question of how to reduce power drain while beefing up processing capacity has proven to be a conundrum that keeps many IT executives awake at night. So it's no wonder technology vendors devote big money to reducing the data center power drain.
Intel and AMD are investing millions in their latest efforts to offer microprocessors that deliver more processing capabilities while consuming less power—Intel with its research into new (as yet unnamed) input/output technology that could support up to 10 processors on a single chip, and AMD with its soon-to-be-released Barcelona chip. IBM's billion-dollar "Project Big Green" initiative, intended to make computing more energy efficient and environmentally friendly, includes a five-step program for companies looking to cut power use in the data center. And a team at Hewlett-Packard Labs last year introduced "dynamic smart cooling" technology that links smart air conditioning systems to a network of sensors measuring temperatures entering and leaving servers, in theory delivering cooling only when and where it's needed. "Cooling beyond needed levels is a waste of energy," says HP Fellow Chandrakant Patel, who heads up the effort. "We can reduce power consumption by 25 to 45 percent."
Meanwhile, Schwartz and his executive team at Sun—which last fall introduced Project Blackbox, a self-contained, shippable data center—have established power consumption as a major area of engineering focus. In a blog entry posted last September, CTO Greg Papadopoulos made it clear that processing power is no longer the most important consideration in equipping data centers. "Just about every customer I speak with today has some sort of physical computing issue: They are maxed out on space, cooling capacity or power needs—and frequently all three," Papadopoulos wrote. "My guess is that we'll look back at today's 'modern' systems to be about as efficient and ecologically responsible as we would now view the first coal-fired steam locomotives."
An IT Problem?
If it sounds like this all veers beyond IT's traditional purview, Ogilvy's Riazi says it's about time. Data center responsibilities historically have been split; facilities managed the physical space, while the CIO managed the operations within that space. But that division of duties has become antiquated. "There are certain conversations that need to take place between IT and facilities about how we manage these facilities more efficiently," she says. "I think that relationship needs to get much tighter."
That need to bridge the gap between IT and facilities was key in the design of HP Labs' dynamic smart cooling technology. In the past, Patel says, data center managers might get admonished for bringing down servers, but no one ever got punished for wasting power. Today, it's possible to get into trouble for either, a change he says shows the need for IT and facilities to develop an integrated view of the data center.
Patel's team at HP is attempting to deliver that view by building a data center architecture that will monitor the entire data center, from the chip level up to the cooling system. In theory, a data center employee could allot power and cooling on a granular level and schedule workloads to make the most out of server resources while minimizing the strain on the power and cooling systems. Gartner's Bell says that's the right approach, and one he expects to be widely available in a few years: "We'll see a more universal management system that connects with the power structure and manages infrastructure and workload in concert, and in real time." But whether IT executives will want to invest in such granular control is debatable. Count Thomas Weisel's Fiore among those who desire such tools. "I'd like to direct my air conditioning to where I need it most," he says.
Conversely, Deloitte & Touche's Quinlan says he can't justify devoting resources to something that isn't a core competency. "We want to run a good data center, but we don't want to get sophisticated about cooling," he says. "We don't have time. We want to focus on services for our clients. If I get to the point where I have to worry about those things on such a minute level, it's time for me to think about outsourcing my data center to people who do that kind of stuff for a living."
Roberts, the Data Center Institute board member, says most of the IT executives he talks to fall in Quinlan's camp. Advances such as HP's cooling technology may be highly effective, he says, but in most cases, CIOs simply won't see them as practical. "Most folks have a huge investment in their equipment already. They're probably not going to make additional investments in the next three to five years." But Ogilvy's Riazi sees that perspective as myopic. She says IT executives have to see beyond their mission statements and bottom lines, and start to think as global citizens.
"It's an irresponsible position to take, and I expect more from CIOs," she says. "Having an operation that reduces carbon emissions is the responsibility of every CIO." To that end, Riazi says Ogilvy started tracking the energy consumption and carbon emissions of its servers this year, with an eye toward making its data centers as "green" as possible in the future. That puts her on the cutting edge of the greening of data centers: Most CIOs can't say how much power their data centers consume, and thus can't establish how much they waste.
That's likely to change as the end-to-end management tools Gartner's Bell foresees becoming commonplace let IT executives track the whereabouts of every last watt. Those who've embraced the emerging data center tools at their disposal will like what they see. Those who haven't may find themselves in trouble with their bosses—whether for servers going down or for power wasted. If such ominous prospects don't send CIOs scrambling for answers, nothing will.