Future
Virtualization Can Save Departments, Not Just Servers
By Karen S. Henrie
Over the past 20 years, companies have amassed a glut of IT infrastructure and put up with levels of waste and inefficiency that they would never tolerate if IT were seen in the same light as more traditional corporate assets. Imagine an auto manufacturer building a plant that gets used for two hours a day. An oil company investing in a refinery that produces at just 5 percent of capacity. A bank with 20 five-story office buildings and only 15 employees working in each one.
Yet that sort of inefficiency is rampant in IT. According to Gartner Inc., a utilization rate of 5 percent to 10 percent on Intel servers is the rule rather than the exception. Most IT organizations run out and buy a new server every time they deploy a new application.
CIOs have willingly tolerated this situation in hopes of making sure they can supply enough IT resources to business users when needed. Many CIOs are distrustful of mixing and matching carefully configured applications on a single platform, for fear it will put one or all of those applications at higher risk of a system-level failure. The operating-system upgrade required by an accounting application could bring down a payroll application running on the same server.
Steadily declining equipment costs have only made matters worse. Processing power, memory and disk space are getting cheaper by the day. Server prices alone have dropped 80 percent or more over the past decade, making it that much easier to simply buy another server or storage device, rather than rationalize an existing setup.
Bill Homa, senior vice president and CIO at Hannaford Brothers Co., a $5 billion grocery chain based in Scarborough, Me., recalls a store-based labor-scheduling system that ran on 33 Windows NT servers. Some ran the application in production, while others were used to develop and test newer versions. "Managing that was a nightmare," says Homa.
This profusion of infrastructure does not come cheap. According to a recent report by IDC, the labor required to maintain a single small application server can cost between $500 and $3,000 per month in a production environment, and that figure excludes costs associated with backup and recovery, network connectivity, power and air conditioning. Multiply that by hundreds, or even thousands, of servers in the typical large IT organization, and it's easy to see how systems-management costs are skyrocketing.
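The scale of that multiplication is easy to sketch. The following snippet uses the IDC labor-cost range cited above; the server counts themselves are illustrative assumptions, not figures from the article:

```python
# Illustrative arithmetic only. The $500-$3,000 monthly labor-cost range per
# small application server comes from the IDC figure cited in the article;
# the server counts below are hypothetical examples.
LOW, HIGH = 500, 3_000  # dollars per server per month

def annual_labor_cost(servers: int) -> tuple[int, int]:
    """Return the (low, high) annual labor cost for a given server count."""
    return servers * LOW * 12, servers * HIGH * 12

for servers in (100, 500, 1_000):
    low, high = annual_labor_cost(servers)
    print(f"{servers:>5} servers: ${low:,} to ${high:,} per year")
```

Even at the low end of the range, a thousand-server shop is spending millions of dollars a year just on routine server upkeep, before power, cooling or backup enter the picture.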
The problem isn't confined to servers. Many storage systems, networks, PCs and even applications are also vastly underutilized. According to a recent Gartner survey, companies routinely spend 70 percent to 80 percent of their total IT budgets supporting established applications and required infrastructure components.
That's why a growing number of IT organizations are now turning to virtualization. The term "virtual" goes back to the days of mainframe time-sharing, when it referred to partitioning and other technological sleights of hand that allowed one computer to carve up its processing, memory and storage resources so that it appeared as a different computer to every user.
Present-day virtualization stems from those early roots. Dan Kusnetzky, vice president of System Software at IDC, says virtualization is akin to "creating and fostering a carefully designed illusion." Gartner defines it more prosaically as "the pooling of IT resources [e.g., computing power, memory, storage capacity] in a way that masks the physical nature and boundaries of those resources from the user."
Virtualization means computers no longer have to be dedicated to a particular task. Applications and users can share computing resources, remaining blissfully unaware that they are doing so. Companies can shift computing resources around to meet demand at a given time, and get by with less infrastructure overall.
Before Qualcomm Inc., a $5.7 billion telecom equipment maker based in San Diego, turned to virtualization, "We were going out and buying servers, and it might take six weeks to have them up and running. We were spending a lot on hardware," says Norm Fjeldheim, senior vice president and CIO. "Many of our engineering applications wanted to live in their own specially configured Windows environment. Development teams didn't want to share servers because they'd risk creating an unstable environment for their own applications."
Qualcomm began using server virtualization software from VMware Inc., an EMC Corp.-owned company, in 2003, and the change has been "wildly successful," says Fjeldheim. "We can create a virtual environment [for engineering applications] in 30 minutes, versus the six weeks it took to wait for and deploy a new piece of hardware. The engineers love it. They don't even realize they are sharing resources with other groups."
About half of Qualcomm's Windows servers, and a quarter of its Linux servers, are now virtual. The company has saved $1.4 million in hardware costs since it first began using VMware, and it has also significantly reduced the space needed to house all that hardware. Qualcomm has consolidated servers by a 30:1 ratio, and improved server utilization rates to 30 percent on average, and to close to 100 percent at peak load times. Meanwhile, virtualization has allowed Fjeldheim to improve utilization rates on the company's storage systems from 30 percent to 65 percent.
Like most proponents of virtualization software, Fjeldheim says the benefits extend well beyond improving utilization rates and cutting costs. "We have better uptime as a result of using virtualization. It has allowed us to abstract applications away from the hardware that runs them. Now we can manipulate the hardware, upgrade it or replace it in the event of a failure, without affecting the application that uses it."
Tony Adams, an IT analyst with J.R. Simplot Co., a $3 billion Boise, Idaho-based agribusiness, says his company has also realized many of the consolidation benefits of virtualization software. J.R. Simplot has virtualized 50 to 60 percent of its 400 servers, according to Adams, and has significantly reduced its inventory of servers. It now runs 20 to 25 virtual servers on every physical server.
But the move has had an added benefit: the development of a "tailor-made disaster recovery plan that depends on our virtualization infrastructure." J.R. Simplot has two data centers in Boise. In the event of a disaster at the primary center, the IT group, says Adams, "can bring up the virtual machines on target hardware at the second location and continue operations," albeit with a slightly degraded level of service.
According to Adams, virtualization "has completely changed the way we budget for projects. [In the past], we had to budget the price of each project to cover the cost of its own hardware. Standard practice was to give every application its own server. It was more a risk-mitigation strategy than anything else. We lacked trust in the Microsoft platform. If we needed to upgrade a server, then we could not subject all of the applications running on it to that risk. Lots of companies have hundreds of servers when a few dozen would do."
Virtualization won't solve all your problems. The virtual servers you create are still IT assets that need to be administered and managed, as Qualcomm's Fjeldheim is quick to point out. Frank Gillett, a principal analyst with Forrester Inc., agrees. "If you put ten virtual servers on a single machine, you eliminate nine pieces of hardware, but you still have ten operating systems to maintain." John Hinkle, vice president and CIO of Trans World Entertainment Corp., has also found that "managing the flexibility [provided by virtualization] is another skill set that we needed to develop and train people in."
Over time, the various virtualization products currently needed to handle virtualization for storage, servers, networks and other infrastructure elements are likely to converge into a single administration tool. Products such as IBM Corp.'s newly released Virtualization Engine 2.0 are meant to make it easier to virtualize across technology layers and manage it all through a single console. Still, getting all those virtualization products to work together remains difficult.
Meanwhile, certain computing tasks will always demand a dedicated machine: "If you have to squeeze the most performance out of a piece of hardware, then you still need the physical hardware," says Adams at J.R. Simplot.
Perhaps the biggest issue IT organizations face, according to a recent report from Gartner, is that there is no "defined and acceptable set of metrics and pricing mechanisms to support virtualized, on-demand services. Almost everything in the IT infrastructure industry has a cost or price associated with a physical box." It will take years for IT vendors to overhaul their pricing to reflect the flexibility and efficiency inherent in virtualization and related technologies. And IT organizations will likewise have to rethink how they charge their own customers for the IT resources they consume.
Among the virtues of virtualization is that the technology dovetails nicely with other, newly minted IT strategies that have captured the attention of CIOs. Fjeldheim sees a common thread running through virtualization, utility computing, grid computing and service-oriented architecture. "Everyone is starting to live in a shared world," he says. "All of these things are ways to provide better service to the customer, because you are no longer dependent on a single point of failure, and so you can bring costs down and add or take away resources as needed."
Rich Lechner, vice president of virtualization at IBM, thinks companies have just begun to see the benefits of virtualization, which he says will "insulate users from underlying infrastructure, and open the door for utility computing."
Mark Stahlman, managing director at Caris & Co., a New York City-based investment bank, believes that virtualization will allow for "the radical simplification" of computer and networking systems, and that it is kick-starting the next "fundamental shift" in the way computer and networking systems are used. That shift, he says, will be similar in scope and investment to the move to decentralized, networked computing that took place in the 1980s and 1990s. "Building virtual 'clouds' of computing, storage, networking and applications resources, permitting flexible deployment and rapid growth, as well as per-usage payments, is becoming the Holy Grail of the IT industry," Stahlman says.
Meanwhile, cheaper, faster and more reliable communications networks are forcing companies to rethink the distributed systems they worked so hard to build over the past two decades. Says Hannaford's Homa: "Retailers used to have very distributed systems in stores, but the world has turned upside down in the past five years. With virtualization, retailers are picking up their applications from stores and putting them back down in the corporate data center," where they can be used more efficiently and monitored more easily. Hannaford recently consolidated 300 servers that formerly had been located in stores and replaced them with a new mainframe system from IBM. The mainframe already has 62 virtual servers running some of its most strategic applications, including its Web-based vendor portal, which Hannaford's suppliers use to schedule docks for delivery, quote prices and more.
Virtualization also gives CIOs some wiggle room when it comes to planning for the future. "In the retail industry, the hardest thing is knowing where we are headed from the standpoint of size and growth," Homa says. "Virtualization allows me to not be all-knowing right now. It allows me to get it slightly wrong and adapt as we go. That is the greatest benefit."