Virtualization: Agility More Significant Than Cost Cutting

Although much of the excitement about virtualization has focused on the cost-cutting it enables, the real value comes from business agility, according to participants at the Ziff Davis Enterprise 2007 Virtualization Summit.

“When management says, ‘Can IT do this?’ you want the answer to be yes,” said Barb Goldworm, founder, president, and chief analyst of Focus Consulting, in her keynote address. By creating a virtual pool of processing and storage resources that can be quickly allocated to new applications and more easily moved from primary to backup locations, virtualization can make an IT organization more agile, resilient and responsive, she said. That theme was echoed by the CIOs and consultants who participated in the event.

Michael Cammack, CIO at Bowman & Brooke, said his firm has a tradition of creating a lot of pilot applications that might be of use to its clients, who are primarily large manufacturers. “We have got to stay leading edge so that when they ask, we can say, ‘Yes, we can do that sort of thing,’” Cammack said.

Still, because those applications don’t get heavily used until a client has a real need for them, he had a large number of very lightly used servers that were perfect candidates for his first virtualization project, which moved the applications running on 100 servers onto 6 servers running VMware.

Next, Bowman & Brooke moved to a more wholesale virtualized architecture to capitalize on using virtualization not just for consolidation but for the virtues of “redundancy, resiliency and recovery,” Cammack said. By allowing him to replicate or move applications between machines and data centers, virtualization is helping him improve system availability.

Although saving money wasn’t the initial motivation for the law firm’s virtualization effort, Cammack said it proved a godsend when he was asked to cut his budget late last year. “We were asked to reduce the overall budget 20 percent. We submitted a budget that was 30 percent less, and now it looks like we’ll come in 40 percent under.”

Virtualization wasn’t the only factor in those savings, but it was a big one. “Yes, the savings are there, but it’s really the agility that matters,” said Margaret Lewis, director of commercial independent software vendor (ISV) marketing at chipmaker AMD.

Lewis said AMD has successfully achieved a ratio of about 30 virtual machines to one physical server. Many of AMD’s business systems now run on a complex of nine active hosts and two “swing” servers, which are used to cover contingencies such as taking over the workload from another server that requires maintenance.

The savings in data center power were even greater than AMD had anticipated, she said. “We saw a 79 percent reduction in power consumption in the first year.”

Although the term “virtualization” has taken on many meanings, the seminar primarily focused on techniques for allowing multiple, logically isolated operating system instances and associated applications to run on a single physical server. Although this technique has been used for decades on mainframe computers, VMware made its name by allowing multiple instances of Windows—or a mix of Windows and other operating systems—to run on a single computer. Microsoft has since entered the virtualization market (and is likely to become a more serious contender with a Windows Server 2008 version of its technology expected next year). XenSource—a virtualization software company acquired by Citrix—and a number of other contenders are basing their offerings on the open-source Xen virtualization engine.

To date, most of the success stories have revolved around companies identifying underutilized servers in their data centers and consolidating them onto a much smaller number of physical machines. The virtual server software allows each application to run as if it were on a separate server, preventing conflicts between different applications.

No new technology comes without complications, however. Even as server virtualization has grown popular partly because it helps combat server sprawl—particularly the proliferation of underutilized, single-function Windows servers—analysts and practitioners are starting to talk about another problem: virtual server sprawl.

The worry is that removing the traditional, practical obstacles to procuring and setting up a new server, and replacing them with a process of activating a new virtual machine that can be accomplished in minutes, may make things just a little too easy. In other words, you could wind up with a lot more virtual machines than you actually need because the creation of these new “servers” is seen as free. And virtual or not, they still have to be managed and tracked, and they still consume computing resources.

Virtual server sprawl is just one example of the management issues that are starting to emerge as virtualization becomes more mainstream, and most participants in the seminar saw it as more of a potential problem to watch out for than one they were experiencing.

Doug Splinter, chief technology officer at the consulting firm Convergent Solutions Group, said his clients are starting to experience virtual server sprawl. In particular, heavy users of virtualization can easily wind up with virtual machines that were set up for testing or some other passing need and continue to run long after that need has passed.

Although part of the answer may come from systems management tools that are better attuned to the virtual environment, the consensus among seminar participants was that it’s mostly a matter of making sure the IT organization has processes for provisioning, managing, and securing virtual machines that are at least as good as the process for setting up physical servers.

In other words, the issues aren’t really specific to virtualization. As Goldworm put it, “Anything can be used badly—we have a lot of experience with that in IT.”
