Go virtual with desktop virtualisation

Desktop virtualisation is becoming a hot topic in the Middle East as it reduces hardware costs and brings control and consistency to the enterprise network

Amit Mathur, Huawei, says that data security is an issue which VDI can address through centralised control.
By Piers Ford | Published August 18, 2013


Desktop virtualisation has come of age in the Middle East as many organisations wake up to the benefits of reduced hardware costs and streamlined access to applications and systems in a highly managed virtual environment.

Some are going down the local route, with applications running on clients using hardware virtualisation. Others are taking the remote approach, in which the operating system is hosted on a dedicated virtual machine and links to the client over a network.

Regardless of which approach is more appropriate, IT decision makers are increasingly enthusiastic about a model that allows them to impose a much higher degree of control and consistency over complex, multi-device environments.

“Desktop virtualisation is based on the concept of separating the operating system [and applications running on it] from the physical host computer and moving it into a shared, centralised environment of servers,” explained Ihab al Saheli, general manager at systems integrator and managed services provider CNS.

“Data from the centralised environment is streamed to clients through the network using protocols like Remote Desktop Protocol and PC over IP.

“Desktop virtualisation effectively functions as a client/server computing model, as application execution happens in a remote location that is linked over the network to a local client. Applications, operating systems, data and settings remain in the centralised environment, and only the keyboard, mouse and display information is transferred to the local clients. These can be dedicated thin clients, or they can be standard desktops, laptops or even tablets.”

Al Saheli said it is common to host multiple desktops on one server platform using a hypervisor, a model commonly known as a Virtual Desktop Infrastructure (VDI).

Cost effectiveness

What lies behind this sudden popularity? To a great extent, it is VDI’s cost effectiveness, something which, according to Haritha Ramachandran, industry manager at Frost & Sullivan’s information and communication technologies practice, is driving adoption.

“IT infrastructure such as PC procurement, provisioning, management and maintenance are significant cost components of the overall IT spend for an enterprise,” she said. “Besides, smaller infrastructure means smaller utility bills, which add up over the course of the year. Therefore, the main reason behind the success story of desktop virtualisation is reduction in operational cost.

“Also, from the perspective of IT managers, desktop virtualisation allows them to have better control of the IT infrastructure, which results in improved end-point manageability and enhanced security. Desktop virtualisation is clearly another virtue in the age of Bring Your Own Device [BYOD].”

At industry giant Cisco, which has long made virtualisation a cornerstone of its core architectures, including borderless networks, data centre and collaboration, data centre leader for the UAE Mark Hosking described VDI as the ‘perfect storm’ that brings everything together.

Apart from the ability to maintain consistent desktop management while reducing hardware refresh costs (he cited a CIO from a global banking business who said he won’t be spending a single dollar on desktop infrastructure in the coming year), Hosking said security and flexibility are the other big drivers.

“Desktop virtualisation allows you to mitigate data leakage and manage compliance across a vast estate of PCs and laptops,” he said. “Once the device is turned off, it is useless because all the data is in the data centre.

“As far as flexibility is concerned, employees can access their corporate desktops from anywhere via any device. So the company sees productivity benefits. But the overriding business benefit is that virtualisation delivers agility to the enterprise. The IT department can respond more quickly to the operational needs of the business.”

Potential drawbacks
However, there are some important considerations to weigh before diving headlong into a VDI strategy. They highlight the potential drawbacks of the model for a business or infrastructure that is not adequately prepared.

As Hosking explained, VDI relies on a tunnel through the network, connecting the virtual desktop client to the data centre. The network must be able to understand and prioritise virtual desktop traffic because, in a multi-media environment, the end-user won’t tolerate a poor-quality experience.

“With a Hosted Virtual Desktop, network availability and quality becomes a vital dependency influencing the quality of the end-user experience,” said Nathan Hill, research director at analyst Gartner.

“Unlike the traditional PC architecture, where local productivity tools are still usable in the absence of network connectivity from a user’s client device to their hosted desktop, the HVD architecture provides no functionality. As such, this architecture is generally a poor fit for highly mobile users that require offline and disconnected access.

“Network bandwidth and performance will vary considerably with workload, low for fairly static office applications and high for multimedia-intensive applications,” explained Hill. “A good rule of thumb for planning average bandwidth consumption is 100-200kbps for each HVD session, but this will increase with multimedia requirements; for example, video at HD resolution can push bandwidth consumption to as high as 5Mbps.
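Hill’s rule of thumb can be turned into a rough capacity-planning calculation. The sketch below is illustrative only: the per-session figures are the estimates quoted above, and the seat counts and concurrency split are hypothetical assumptions, not figures from any vendor.

```python
# Rough HVD bandwidth sizing, based on the rule-of-thumb figures quoted
# above: 100-200 kbps per ordinary session, up to ~5 Mbps for HD video.
# Seat counts below are illustrative assumptions only.

def hvd_bandwidth_mbps(office_sessions, video_sessions,
                       office_kbps=200, video_mbps=5.0):
    """Estimate aggregate bandwidth in Mbps for concurrent HVD sessions.

    Uses the upper end of the office-session estimate (200 kbps)
    to stay conservative.
    """
    office = office_sessions * office_kbps / 1000  # kbps -> Mbps
    video = video_sessions * video_mbps
    return office + video

# Example: a 500-seat site where 25 users stream HD video concurrently.
print(hvd_bandwidth_mbps(475, 25))  # 475*0.2 + 25*5 = 220.0 Mbps
```

Even this crude arithmetic shows why a handful of multimedia-heavy users can dominate the link budget, which is why Hill stresses testing against real workloads.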

“Bandwidth is one aspect, but latency tends to be more important from a performance and user satisfaction perspective. Performance tends to degrade above 100ms latency and is more noticeable with multimedia content or high I/O requirements, for example word processing, with 150ms useful as a benchmark beyond which some users will be less satisfied with performance or reject the technology. This is often a subjective area of end-user satisfaction, so it’s hard to generalise without conducting a test or pilot in a customer’s environment. Certainly latency over 200ms is likely to be unacceptable in most scenarios.”
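Hill’s latency break points can be summarised in a small helper. This is a sketch only: the 100ms, 150ms and 200ms thresholds are the figures he quotes, and, as he notes, user tolerance is subjective, so any real deployment should validate them in a pilot.

```python
def rate_hvd_latency(latency_ms):
    """Classify round-trip latency for an HVD session against the
    thresholds quoted above (100 ms, 150 ms, 200 ms)."""
    if latency_ms <= 100:
        return "good"          # little perceptible degradation
    elif latency_ms <= 150:
        return "degraded"      # noticeable with multimedia or typing
    elif latency_ms <= 200:
        return "poor"          # some users will reject the experience
    else:
        return "unacceptable"  # likely unacceptable in most scenarios

print(rate_hvd_latency(120))  # → degraded
```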
