Virtualization means computers no longer have to be dedicated to a particular task. Applications and users can share computing resources, remaining blissfully unaware that they are doing so. Companies can shift computing resources around to meet demand at a given time, and get by with less infrastructure overall.
Before Qualcomm Inc., a $5.7 billion telecom equipment maker based in San Diego, turned to virtualization, "We were going out and buying servers, and it might take six weeks to have them up and running. We were spending a lot on hardware," says Norm Fjeldheim, senior vice president and CIO. "Many of our engineering applications wanted to live in their own specially configured Windows environment. Development teams didn't want to share servers because they'd risk creating an unstable environment for their own applications."
Qualcomm began using server virtualization software from VMware Inc., an EMC Corp.-owned company, in 2003, and the change has been "wildly successful," says Fjeldheim. "We can create a virtual environment [for engineering applications] in 30 minutes, versus the six weeks it took to wait for and deploy a new piece of hardware. The engineers love it. They don't even realize they are sharing resources with other groups."
About half of Qualcomm's Windows servers, and a quarter of its Linux servers, are now virtual. The company has saved $1.4 million in hardware costs since it first began using VMware, and it has also significantly reduced the space needed to house all that hardware. Qualcomm has consolidated servers by a 30:1 ratio, and improved server utilization rates to 30 percent on average, and to close to 100 percent at peak load times. Meanwhile, virtualization has allowed Fjeldheim to improve utilization rates on the company's storage systems from 30 percent to 65 percent.
Like most proponents of virtualization software, Fjeldheim says the benefits extend well beyond improving utilization rates and cutting costs. "We have better uptime as a result of using virtualization. It has allowed us to abstract applications away from the hardware that runs them. Now we can manipulate the hardware, upgrade it or replace it in the event of a failure, without affecting the application that uses it."
Tony Adams, an IT analyst with J.R. Simplot Co., a $3 billion Boise, Idaho-based agribusiness, says his company has also realized many of the consolidation benefits of virtualization software. J.R. Simplot has virtualized 50 to 60 percent of its 400 servers, according to Adams, and has significantly reduced its inventory of servers. It now runs 20 to 25 virtual servers on every physical server.
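To make the consolidation arithmetic concrete, the sketch below works through the midpoints of the figures Adams cites (400 servers, 50 to 60 percent virtualized, 20 to 25 virtual machines per physical host). The function name and the exact midpoint values are illustrative assumptions, not figures from J.R. Simplot:

```python
import math

def consolidation(total_servers, virtualized_fraction, vms_per_host):
    """Estimate how many physical hosts a virtualized fleet needs.

    Returns (virtual machines, physical hosts required,
    physical servers eliminated).
    """
    vms = round(total_servers * virtualized_fraction)
    hosts = math.ceil(vms / vms_per_host)
    return vms, hosts, vms - hosts

# Midpoints of the ranges in the article: 55% of 400 servers
# virtualized, 22 VMs per physical server.
vms, hosts, eliminated = consolidation(400, 0.55, 22)
print(vms, hosts, eliminated)  # 220 VMs fit on 10 hosts, freeing 210 machines
```

At those midpoints, roughly 220 workloads collapse onto about 10 physical machines, which is the scale of inventory reduction the article describes.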
But the move has had an added benefit: the development of a "tailor-made disaster recovery plan that depends on our virtualization infrastructure." J.R. Simplot has two data centers in Boise. In the event of a disaster at the primary center, the IT group, says Adams, "can bring up the virtual machines on target hardware at the second location and continue operations," albeit with a slightly degraded level of service.
According to Adams, virtualization "has completely changed the way we budget for projects. [In the past], we had to budget the price of each project to cover the cost of its own hardware. Standard practice was to give every application its own server. It was more a risk-mitigation strategy than anything else. We lacked trust in the Microsoft platform. If we needed to upgrade a server, we didn't want to subject all of the applications running on it to that risk. Lots of companies have hundreds of servers when a few dozen would do."