The No. 1 rule for implementing desktop virtualization

User experience is a crucial factor in the success or failure of a desktop virtualization project

To get the most out of server virtualization, you have to plan, but the intended goals of data center consolidation and the resulting cost savings are relatively simple to outline and work toward. For desktop virtualization, it’s not as straightforward.

One frequently cited motive for desktop virtualization is more centralized management of devices, both desktop and mobile. That should make it easier to install security patches and operating system upgrades and to ensure that users work from the same image no matter which device they use.

What people don’t realize is that you still have the same management problems, said Susie Adams, chief technology officer for Microsoft’s federal government business.

“You just switch where those problems reside, and with desktop virtualization comes a pretty hefty infrastructure to support it,” she said. “You need network bandwidth and the hardware and other equipment to host all of those simultaneous users through a network access connection, so it’s not as cut and dried as it might seem.”
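Adams' point about infrastructure sizing can be made concrete with a back-of-envelope calculation. The figures below — per-user display-protocol bandwidth and a burst-headroom margin — are illustrative assumptions, not vendor specifications; real requirements depend on the workload and protocol in use.

```python
# Rough sizing sketch for concurrent virtual-desktop sessions.
# The per-user bandwidth and headroom figures are assumed for
# illustration only; measure real traffic before committing to numbers.

def aggregate_bandwidth_mbps(concurrent_users: int,
                             per_user_mbps: float = 1.5,
                             headroom: float = 0.3) -> float:
    """Estimate the network bandwidth needed to host all
    simultaneous users, with a margin for traffic bursts."""
    return concurrent_users * per_user_mbps * (1.0 + headroom)

# Example: 1,000 simultaneous users at an assumed 1.5 Mbps each
print(aggregate_bandwidth_mbps(1000))  # 1950.0 Mbps
```

Even at modest per-user rates, a large deployment quickly demands gigabit-class capacity on the hosting side — which is exactly the "pretty hefty infrastructure" the quote refers to.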

User experience will also be a crucial factor in the success or failure of a desktop virtualization project. Users might react negatively to having resources taken away from their direct control and put under centralized management, and any change to the look and feel of the desktop image they use could generate pushback. Agencies will have to provide significant support, at least for a time.

Other considerations are data management and the ways activities such as data backup and data loss prevention are handled. There will be differences in the way the network needs to handle the data, said Jim Smid, chief technology officer at Iron Bow Technologies. So the network, servers, endpoints and other components will have to be tweaked to ensure that there’s no degradation in overall performance.

“Again, with desktop virtualization it’s all about the end-user experience,” he said. “They can’t suffer from a loss in performance or feature functionality.”

The best targets for desktop virtualization, at least initially, are areas where desktop images are very similar. For example, users who typically work with Microsoft Office, Internet Explorer and not much else are the “classic, easy hits,” said Mark Weber, president of NetApp’s U.S. Public Sector.

“When you are rolling out 1,000 seats and the image you are putting on each of them is the same, that’s a beautiful target for desktop virtualization,” he said. “With that you’re going to get huge savings on such things as storage, security and reliable backups.”

Beyond those easy savings, however, things can get more complicated, so agencies must know why they want to deploy desktop virtualization and they must be specific about it. In many cases, they might find they can achieve the same results in another, simpler way.

Adams said people look to desktop virtualization to help them with broad themes such as remote access management, but in many instances, it proves to be overkill. Other people talk about wanting to centrally manage all of their desktops, “and that’s also overkill,” she said.

“What agencies need to do before they deploy desktop virtualization is get a handle on categorizing their information workers and the devices they need to do their jobs and what information they are going to allow those workers to access,” she said. “Then they can decide that, if there’s a particular category of work they need to do that’s only appropriate from a virtualized desktop, they get that capability. But if they don’t need it, they don’t get it.”

The No. 1 rule for implementing desktop virtualization is to start small. Assess the environment to discover the best targets for virtualization, then make a step-by-step plan and test a proof of concept through a small pilot project so you can see where the problems might lie and whether the solution you have chosen will scale up to accommodate more users.

“At the end of the day, the most important thing is to retain that user experience,” Smid said. “Being able to show that is the only sure way of getting customer buy-in.”

About this Report

This report was commissioned by the Content Solutions unit, an independent editorial arm of 1105 Government Information Group. Specific topics are chosen in response to interest from the vendor community; however, sponsors are not guaranteed content contribution or review of content before publication. For more information about 1105 Government Information Group Content Solutions, please email us at [email protected]