Virtual Desktop Infrastructure (VDI) came onto the scene just over a decade ago, and businesses got excited about providing desktop functionality for a lower cost. Most of the excitement faded, though, as companies began to implement VDI. Infrastructure became the focus due to VDI solution complexity, and that meant that any promised cost savings went right out the window. Today, traditional VDI technology is outdated, and it’s time for a modern approach to delivering virtual desktops, apps and workstations.
The Sizzle…and the Fizzle
For decades, IT teams have been tasked with buying physical desktops: they look at how much CPU, memory and storage the units have, decide which configuration best suits their users, place the order, and away they go. Why should purchasing virtual desktops be any different? Shouldn’t IT staff be focused on those same desktop attributes and on what best meets user requirements? Instead, they are forced to spend most of their time thinking about and grappling with "infrastructure": servers, storage, layers, management tools and much more. Managing all this infrastructure is exhausting and expensive, and when the focus is on the "I" rather than the "D," users end up unhappy and IT staff end up frustrated and overworked.
How did this happen? How did we lose the “D” in VDI? Well, IT spends about $1,500 per desktop (or laptop), a cost amortized over three to four years. But as it turns out, the overhead of managing physical desktops is often unsustainable, and once the world went mobile, keeping users tethered to a PC was a sure way to hand your competitors the advantage. In response, some IT teams made the decision to deploy virtual desktops and apps.
The promise of enabling a mobile workforce to be productive anywhere, while making IT more efficient and fortifying information security is certainly compelling. But in order to implement VDI on-premises, IT needs to translate desktop attributes into expensive and complex data center technologies. IT staff started asking questions such as, “If I have 1,000 users, how many servers do I need? How much shared SAN/NAS storage do I need? In which data centers do I put this infrastructure?”
To figure out how many servers will be needed, IT has to assess the organization’s desktop usage. They will ask questions like, “Which applications are used? What is the CPU and memory usage rate? How many users can I fit onto a certain class of server? Based on these use cases, do I need 20, 30 or 50 servers for 1,000 users?” It all depends on usage.
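That sizing exercise boils down to a ceiling division over an assumed user density per server. Here is a minimal back-of-envelope sketch; the density figures are purely illustrative assumptions chosen so the 20/30/50-server range from the questions above falls out, not measured values:

```python
# Back-of-envelope VDI server sizing. All densities below are
# illustrative assumptions; real sizing depends on measured usage.

def servers_needed(users, users_per_server):
    """Ceiling division: a partially filled server is still a server."""
    return -(-users // users_per_server)

# Hypothetical densities (users per server) for different workload types.
densities = {"task worker": 50, "knowledge worker": 34, "power user": 20}

for profile, density in densities.items():
    count = servers_needed(1000, density)
    print(f"{profile}: {count} servers for 1,000 users")
# task worker: 20, knowledge worker: 30, power user: 50
```

The point of the sketch is how sensitive the server count is to the density assumption, which is exactly why the usage assessment matters.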
IT teams also have to figure out storage needs, which has proven even harder. Local storage on PCs is the cheapest storage available – about $100/TB. SAN/NAS can cost 25 to 100 times as much. If each of 1,000 users had 1 TB of storage on their desktop, you would need 1,000 TB (a full petabyte) of SAN/NAS. That is extremely expensive.
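A quick worked version of that comparison, using the rough figures from the text ($100/TB for local disks, 25 to 100 times that for SAN/NAS); these numbers are illustrative, not vendor quotes:

```python
# Rough cost comparison: local PC disks vs. shared SAN/NAS,
# using the illustrative per-TB figures cited in the text.

USERS = 1000
TB_PER_USER = 1
LOCAL_COST_PER_TB = 100                      # ~$100/TB for local PC storage
SAN_MULT_LOW, SAN_MULT_HIGH = 25, 100        # SAN/NAS premium over local

total_tb = USERS * TB_PER_USER               # 1,000 TB (1 PB)
local_cost = total_tb * LOCAL_COST_PER_TB
san_low = local_cost * SAN_MULT_LOW
san_high = local_cost * SAN_MULT_HIGH

print(f"Local disks: ${local_cost:,}")               # $100,000
print(f"SAN/NAS:     ${san_low:,} - ${san_high:,}")  # $2,500,000 - $10,000,000
```

The same capacity that costs about $100K in local disks lands somewhere between $2.5M and $10M on shared storage, which is the adoption problem the next paragraph describes vendors trying to optimize away.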
Recognizing that this could hurt VDI adoption, providers devised ways to optimize storage. The conversation went something like this: “Oh, you can optimize with a single image so you don't need to have 1000 copies of Windows OS. Now, let's put in layers so you don't need to have 1000 copies of each application. Wait, what about profile management tools to store end user personalization? You need that, too. Oh, and you can no longer manage it with your existing PC management tools like SCCM and Altiris. So, your VDI infrastructure is a stand-alone management framework.”
These all appear to be logical solutions, but here’s the problem: Windows wasn't architected to operate this way, so customers struggle with app compatibility, corrupted profiles and application updates that blow away desktops. Meanwhile, storage vendors started implementing de-duplication so that the 1,000 copies of Windows and applications across users' desktops were automatically de-duped at the storage layer. Hyper-converged infrastructure (HCI) vendors ultimately adopted de-duplication as well, and while HCI meaningfully reduced the cost of VDI implementations, it hasn't gone far enough.
You’ve figured out how many servers and how much storage your organization will need; now you have to think about where all this infrastructure is going to live. Which data center should it be in? How far away will all your end users be from that data center? What does that mean for latency? What will users experience? How much bandwidth will they require?
VDI, Meet the Cloud
Notice that there’s been little mention of the desktop so far. That’s because all your time is consumed with infrastructure complexity! IT departments have had to jump through complex infrastructural hoops to deliver a mission-critical workload to a class of users. But IT teams have way more important things to do and more value to add than dealing with all this complexity.
Cloud computing puts the “D” back in VDI by eliminating complexity. It’s an opportunity to completely re-imagine what the phrase “virtual desktops” means. Now, the “data center” is any region of the public cloud you select. Essentially, the infrastructure becomes invisible in that region – at least in terms of you having to worry about it. Virtual desktops can be placed close to your users so they have a great experience. All IT needs to do is determine the configuration of the desktop, just like they used to determine the configuration of a physical PC.
This makes buying a cloud-based VDI solution quite similar to—and in fact, simpler than—buying a PC. A cloud-based virtual desktop service allows the IT team to simply choose a desktop configuration. They order the number of units needed for their end users and specify one or more public cloud regions. Then they apply their corporate image to the virtual desktops. But rather than shipping PCs all over the place, IT simply emails a desktop link to each user. Even better? This can happen in a day.
With the cloud, VDI is hot again, and the excitement is all about the ease of delivering all those benefits that have been promised for so long. VDI complexity is eliminated, so instead of being mired in infrastructure, you and your IT team can get out of the business of “keeping the lights on” and pivot to more strategic projects that yield additional business value.
About the author: Amitabh Sinha has more than 20 years of experience across enterprise software, end user computing, mobile, and database software. Amitabh co-founded Workspot with Puneet Chawla and Ty Wang in August 2012. Prior to Workspot, Amitabh was the general manager for enterprise desktops and Apps at Citrix Systems. In his five years at Citrix, Amitabh was vice president of product management for XenDesktop and vice president of engineering for the Advanced Solutions Group. Amitabh has a Ph.D. in computer science from the University of Illinois, Urbana-Champaign.