PC Virtualization: What You Need to Know

By Jennifer Zaino  |  Posted 05-23-2007

Applying virtualization technologies to corporate PC infrastructures can offer enterprises a way out of the unmanaged PC mess; help them cut upgrade, support and maintenance costs; eliminate application conflicts; improve business continuity and data security; speed application development and rollouts; and provide workers anytime/anywhere access to their desktops.

What if you could make everything you hate about your corporate PCs go away? The impossibility of managing them in the face of users' unauthorized application and MP3 downloads; the three-year refresh cycles; the endless testing to ensure that applications will play nicely together; the concerns over critical corporate data residing unsecured on employees' notebooks; the loss of productivity when systems take a nosedive?

Businesses are taking a closer look at how desktop and application virtualization technologies can have an impact on all these areas. And none too soon: Support and maintenance account for 80 percent of PCs' total cost of ownership, experts say, and 5 percent of systems experience a hardware failure in their first year on the job.

Still, CIOs may face one or more challenges: End users may object to changing out PCs for thin clients; IT may have to sort out Windows usage issues, given the one-to-one relationship between hardware and OEM Windows licenses; and infrastructure teams will have to make sure server platforms are robust enough to run multiple virtual PCs and that network bandwidth is managed to mitigate latency and quality of service problems. But given the promises of less expense and complexity and more agility and adaptability, these technologies represent the future for a growing number of companies.

Desktop and application virtualization deployments will grow by 28 percent and 24 percent, respectively, according to a 2006 Enterprise Management Associates survey. Indeed, before the decade's out, Microsoft is expected to add a hypervisor to Windows Vista, effectively putting a desktop virtualization platform into the box.

"Our idea is to have virtual services running virtual applications on a virtual OS—to be a 100 percent virtualized shop; that is where the industry and the market are progressing," says Scott Butcher, vice president and application integration manager of Bank of America Corp.'s Global Trading Infrastructure Trading Desktop Support group.

The company is evaluating desktop virtualization options, and has already virtualized more than 70 of its 5,000 applications. "Within the next year, we expect almost every single one of our applications to be virtualized," Butcher says. He plans over the next few months to determine cost savings from the effort.

Generally, desktop virtualization can be defined as a PC environment in which components—including operating systems and applications—execute in a protected area, isolated from the underlying hardware and software platform, says Natalie Lambert, a research analyst at IT advisor Forrester Research. Forms of desktop virtualization include hosted desktops, where desktop environments are hosted remotely in a data center, either on a server (using Windows Terminal Services or virtual machines) or blade PC; and desktop virtualization platforms, where virtual machines are hosted directly on users' PCs.

Application virtualization, where an application communicates with the host operating system through a virtualization layer, provides a way for businesses to execute an application in isolation. Coupled typically with application streaming as a software deployment mechanism, it lets users get the applications they need on demand wherever they are. With streaming, users can access cached applications even when they're not connected to a network.
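The cache-then-stream pattern described above can be sketched in a few lines. This is only an illustration of the idea, not any vendor's actual mechanism; the function names and the byte-blob packaging are hypothetical, and real streaming platforms use their own package formats and protocols.

```python
from pathlib import Path

def fetch_app(name, cache_dir, stream_fn):
    """Return an application package, preferring the local cache.

    On a cache hit the user can launch the app even when offline;
    on a miss the package is streamed once and cached for next time.
    `stream_fn` stands in for the server-side streaming call.
    """
    cache_dir = Path(cache_dir)
    cached = cache_dir / name
    if cached.exists():
        return cached.read_bytes()   # cache hit: works offline
    data = stream_fn(name)           # cache miss: stream on demand
    cache_dir.mkdir(parents=True, exist_ok=True)
    cached.write_bytes(data)         # populate cache for next launch
    return data
```

The key design point is that the network is touched only on first use; subsequent launches, including disconnected ones, come from the local cache.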

To date, it's been more likely for organizations to deploy one of these technologies to satisfy niche requirements:

  • A financial services company might assign blade PCs to its high-powered users, such as traders who require secure but dedicated hardware, and who might otherwise run applications on multiple boxes under their desks.
  • Another company might give access to a server-based application in a shared desktop environment via Windows Terminal Services to limited-task workers, like call center employees, using thin clients.
  • Techies might use a desktop virtualization platform (aka hosted virtualization) to run multiple operating systems simultaneously in virtual machines on a single system for testing scenarios.
  • Another organization might deploy application virtualization to address specific pain points, such as letting users run two different versions of the same program on their PCs.


Making the Switch to PC Virtualization

But the time seems right for CIOs to explore the possibility of providing a virtualized computing experience—in whatever form—to a larger swath of their end users. Hosted desktop virtualization based on virtual machines raises the bar on personalization and performance, for instance, making it a more attractive option for heterogeneous office workers and administrators than traditional server-based computing. "Companies are thinking that the equipment that has to be maintained at the desktop can be very simple—a thin client, or it can be a PC that just runs remote desktop access—so they'll save on desktop hardware costs, and maintenance costs, plus they can bring the same kind of availability and functionality to a virtual PC as they did to the [virtual] server," says John Sloan, an Info-Tech Research Group senior research analyst. Virtual machines can be moved from one box to another for high availability, for example.

Similarly, the move by vendors to deliver desktop virtualization platforms (or hosted virtualization) with manageability features makes this an intriguing option for those who need to quickly provide mobile and contract knowledge workers with a secure, corporate-approved desktop. "While that [virtual machine] image has all the attributes of a PC, it is actually a file, so you can drag it to any portable media," says Forrester's Lambert. These virtual machines can be deployed with corporate-approved applications and other requirements, as well as encryption, kill switches and policies that ensure that any virtual machines accessing corporate resources are up to spec in terms of virus protection and other features. Once deployed, IT doesn't have to worry about anything that goes on the PC outside the virtual machine. But when there's differentiated ownership—as there might be if a contractor uses an outside PC for a job—IT will have to purchase another Windows license from a retailer for the virtual machines it will be deploying.
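A policy gate of the kind described above, where a virtual machine must be "up to spec" before it can touch corporate resources, can be sketched as follows. The field names and thresholds are hypothetical, chosen only to illustrate the check; real products enforce such policies at connection time through their own management consoles.

```python
from datetime import date

# Illustrative corporate access policy (values are assumptions)
POLICY = {
    "encryption_required": True,
    "max_av_signature_age_days": 7,
}

def vm_is_compliant(vm, today):
    """Return True if a VM image meets the access policy.

    `vm` is a dict with a `disk_encrypted` flag and an
    `av_signature_date` (datetime.date of the last AV update).
    """
    if POLICY["encryption_required"] and not vm.get("disk_encrypted"):
        return False                       # unencrypted disk: deny
    sig_age = (today - vm["av_signature_date"]).days
    return sig_age <= POLICY["max_av_signature_age_days"]
```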

Hosted virtualization may even end today's endless investment of IT's time and energy in trying to standardize hardware configurations. "When standardization is no longer a factor of hardware but of how you install software on [that hardware], then today's effort in stable platforms, in configuration stability and so on will become redundant," says Brian Gammage, a vice president and fellow at IT advisor Gartner.

Windows Vista migrations may drive more organizations to virtualize their clients or applications as well. Companies can run Vista on hosted desktops in the data center without upgrading client hardware that may not be able to support the new operating system, for example. Or, using a hosted virtualization platform, they could let users run a legacy application on a Windows XP virtual machine residing on a Vista PC. Others may take advantage of application virtualization, to make sure the new operating system image they roll out with their Vista upgrades stays lean and pristine.

But even CIOs not yet moving to Vista may find virtualization valuable for addressing other concerns, including security and compliance. At the U.S. Patent and Trademark Office, 500 patent examiners work from home, accessing their desktops remotely from the agency's data center servers, which run VMware Inc.'s Virtual Desktop Infrastructure, based on the company's ESX virtualization technology and hosting Windows XP.

Their desktop images, including applications and data that used to be on their C drives, now reside on a virtual C drive on a storage area network in the data center. Virtualization removes concerns about home workers' desktops "being out and loose," at the same time that it helps alleviate the problem of releasing applications across multiple boxes, each with its own interdependencies, says Patent Office chief technology officer Griffin Macy.

Businesses that move desktop environments to the data center also may find compliance audits easier, because they've limited copies of data, and restricted its location and access to it.

Ask Your Desktop Support Lead:
How many support calls and desk-side support trips result from application-related problems?

Ask Your Compliance Officer:
What gaps remain to ensure that data is secure and can be accessed only by authorized individuals?


Setting a Virtualization Strategy


Organizations may initially consider virtualization technologies to address tactical concerns, such as the need to cut costs, but they should also consider how virtualization technologies can serve more strategic goals.

At Bank of America, Butcher isn't blind to the savings that can accrue from virtualization, but he sees those in the context of bigger-picture requirements to improve SLAs for critical applications and keep operations running in the event of a disaster. Virtualizing applications using Altiris' SVS (Altiris was recently acquired by Symantec Corp.), and streaming them to the desktop using AppStream, addresses both issues, he says. The application delivery lifecycle is being crunched from two weeks to just days, and users have immediate access to their applications from any machine.

It's an advantage to be able to "virtualize the applications at hot or cold or makeshift sites, and have users provisioned for the application," Butcher says. "We don't need a one-for-one box that's maintained for the users."

Not everyone sees virtualization's potential at first, though. Forrester's Lambert says clients often come to her to discuss routine issues around client systems management or remote user access, but "then you start having these conversations where you talk about what virtualization brings to the table, and it really opens eyes."

Eyes were indeed opened at the Patent Office, where desktop virtualization is now a linchpin of the organization's goal to expand telework initiatives. It's not easy to find the highly educated engineers needed to qualify as patent examiners, and offering the work-at-home option can strengthen the agency's ability to attract and retain staff.

"We need highly skilled people," Macy says. "If [desktop virtualization] helps recruitment and retention, that's pretty profound." With its virtual desktop infrastructure implementation, the agency meets remote patent examiners' needs for both high performance on their heavily graphics-oriented applications and high availability: Users (a maximum of 16 per server) connect to their standalone virtual machines running Windows XP, and the virtual machines can be moved to other servers automatically if a box goes down.

At the Patent Office, thick clients remain the computing platform for remote patent examiners. That's in contrast to the Wyse thin clients that, by the end of April, were due to replace 700 desktops at the home and remote offices of Amerisure Insurance, in the company's Citrix Presentation Server-based computing environment (Amerisure started with Presentation Server 3.0 and is now on 4.0). Jack Wilson, enterprise architect and assistant vice president at Amerisure, came on board three years ago to help set strategy in an IT department forced to manage layer upon layer of technology, from mainframe to midrange to Unix and Windows systems.

"The way Citrix is normally deployed is tactically, to solve a specific problem versus as a strategy," Wilson says. But that piecemeal approach just leaves another tier of technology to maintain, he says, and that's not an efficient use of the midsize business's resources. Amerisure uses Citrix as a complete strategy that removes physical dependencies on its employees' ability to access systems or data. Wilson designed the architecture so eight identical Citrix servers each can handle 70 to 90 individual desktop sessions, and are load-balanced so every new sign-on goes to the least used server in the farm.
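The least-used-server routing Wilson describes amounts to a least-connections load balancer. A minimal sketch, with hypothetical server names and the 90-session ceiling taken from Amerisure's stated 70-to-90-sessions-per-server design:

```python
def pick_server(farm):
    """Return the server with the fewest active sessions.

    `farm` maps server name -> current session count.
    """
    return min(farm, key=farm.get)

def sign_on(farm, capacity=90):
    """Route a new sign-on to the least-used server and record it."""
    server = pick_server(farm)
    if farm[server] >= capacity:
        raise RuntimeError("farm at capacity")
    farm[server] += 1
    return server
```

Because every server is configured identically and hosts the full application set, any sign-on can land on any server; the balancer only has to count sessions, not match users to machines.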


Gauging Success

Along with keeping the Citrix servers identical for ease of support, Wilson's standardization plan lets any server in the farm accommodate everyone, from one-application infrequent users to multiple-application power users. All software goes through a governance process to justify its use, and is moved to production via an install process that clones the application across the entire Citrix server farm. No matter which server a user connects to, all that person's authorized applications are accessible.

One reason the virtualized desktop approach works for Amerisure is that the company's computing environment is relatively stable—its software is mostly well-behaving 32-bit Windows applications that are compatible with Terminal Services and needed few if any code revisions to run in a Citrix environment. The company did have to address some network bandwidth problems initially, but today, every user gets a clearly defined amount of bandwidth to start, and network bandwidth is being expanded with fiber.

But it can be a challenge to convince users to trade in their fat clients for thin ones. Amerisure addresses this by pointing out the positives: Citrix's "roaming profiles" feature lets users set individual preferences for the look of their desktop sessions, so they can maintain some personal control; and virtualization paves the way for an effective remote work strategy.

The payoff? When bad weather forced about a quarter of Amerisure's employees to stay home recently, no productivity was lost because most signed on remotely. What's more, Wilson says, the first thin client refresh won't be necessary for at least six to nine years—the company will save $3 million to $4 million by bypassing several refresh cycles and avoiding the business disruptions that typically accompany upgrades.

Ask Your Applications Team Lead:
What are the characteristics of all corporate-approved applications, including performance, availability and compatibility requirements?

Ask Your Senior Business Managers:
What objections might business users raise about giving up PCs for thin clients?


From Plan into Practice


Virtualization drives users to abstraction—literally—but some might say it drives them to distraction, which makes it critical for IT to streamline the computing experience.

Russell Investment Group, a global investment firm with $2.4 trillion in U.S. assets, is switching gears in its quest to deliver applications to users as a service, says Greg Nelson, senior technology consultant in the company's Technology Consulting Services architecture group, whose office reports to the CIO. By year's end, Russell will have nearly completed its switchover from a centralized Terminal Services-based delivery model to using Microsoft's application virtualization and streaming platform as the primary software delivery method to some 2,500 desktops in 13 offices worldwide.

Russell had originally turned to server-based computing to deliver consistent service at lower cost, and anywhere computing, to a staff that is 90 percent knowledge workers. But one problem with centralized computing is that delays occurring during normal business processes become magnified when users aren't isolated from one another's behavior. It became clear that many of the efficiencies Russell sought from server-based computing could be had instead with Microsoft's SoftGrid application virtualization tool. "Everything users like about the desktop—snappy, good video performance, audio, etc.—is provided, and at the same cost model and cost savings we've gotten when we've done centralized computing," Nelson says.

In combination with streaming technology, Russell relies on a self-service model that lets authorized users pick applications from a list for immediate availability, increasing IT's responsiveness to business needs.

Still, some kinks linger. For instance, Microsoft Office is part of the base operating system image, so virtualized applications can tie into it—currently, though, there's no way for one virtualized application to link to another.

Other organizations are working out user connections to desktops hosted in the data center. The logic to dynamically map users to virtual machines and images—or broker connections—hasn't been perfected with the virtual desktop infrastructure yet. "That brokering logic today needs customizing to make it work, with some manual intervention," says Gartner's Gammage.

Collier County Public Schools in Naples, Fla., is moving to a virtualized model—22,000 thin clients and VMware ESX Server on 508 HP blade servers. The district started with a direct connect model but no connection broker, just hard-coding the IP address in the thin client to connect with a virtual machine image. "Now we think it's a better approach to develop a connection broker and Web portal," says school district technology director Tom Petry, "so users' desktops will be wherever the users are." The school district worked with VMware's consulting group on the connection broker project, and is now finishing quality assurance testing. But Petry says he was disappointed by delays along the way.
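The connection-broker logic Gammage and Petry describe replaces hard-coded IP addresses with a lookup that maps each user to a virtual machine wherever they sign in. A minimal sketch, assuming a static first-come assignment from a pool; the addresses are hypothetical, and real brokers (such as the one VMware's consultants built for Collier County) also handle pooling policies, VM power state and failover:

```python
ASSIGNMENTS = {}                                  # user -> VM address
POOL = ["10.0.1.11", "10.0.1.12", "10.0.1.13"]    # free VM images

def broker(user):
    """Return the VM a user should connect to, wherever they sign in."""
    if user in ASSIGNMENTS:           # returning user: same desktop
        return ASSIGNMENTS[user]
    if not POOL:
        raise RuntimeError("no free virtual machines")
    vm = POOL.pop(0)                  # new user: claim a free VM
    ASSIGNMENTS[user] = vm
    return vm
```

The payoff over hard-coding is the stable user-to-desktop mapping: the thin client asks the broker, not a fixed IP, so the desktop follows the user rather than the device.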

Despite obstacles, however, this is the time for companies to start integrating virtualization into their plans. "The road map has to be designed to support change," Gammage says. Client computing isn't just about buying PCs for users any more. It's about evolving your infrastructure to meet users' changing requirements.

Ask Your Help Desk:
How long does it take from the time a user requests an application until it is delivered to the desktop?

Tell Your Enterprise Architect:
Design the roadmap to accommodate changing user needs.