infoTECH Feature

November 12, 2014

Making the Cloud Feel Local

By TMCnet Special Guest
Steve Riley, Technical Leader in the Office of the CTO, Riverbed

Industry forecasts predict that by 2017, two-thirds of all workloads will be processed in the cloud.

What workloads? The applications businesses around the world use every day—email, document sharing and storage, collaboration, ERP, CRM, and more. If these critical workloads aren't secure, available, and resilient, the businesses that depend on them will suffer.

Local data centers have always given businesses direct management of and visibility into critical workloads. Now that those workloads are moving to the cloud, how can businesses actively manage them to ensure the applications keep performing?

Keeping resources local has been an aspect of the IT landscape for most of its history. Regulatory stipulations or security policies might require that information be contained within certain national borders. Performance expectations might require that data be as close to users as possible. Application platforms might offer features only in a subset of geographical regions. In all these cases, and in similar ones, location matters because it's a constraint that must be accommodated somehow. Location often limits what we might otherwise wish to do with our information, and where.

Or, more accurately, location has historically been a constraint. The industry is at the threshold of a significant change—a change poised to eliminate this constraint from most decisions. With the help of modern tools designed for this purpose, IT organizations can liberate themselves from the limits of distance and location. Applications and data can be placed wherever it is optimal for the business and quickly moved as necessary. Users will experience no performance degradation. Administrators and developers can retain visibility into application behavior regardless of how much distance might separate them.

Indeed, location is transforming from a constraint into a feature—organizations can individually optimize the location of data, applications, and people, and achieve competitive advantage as a result. The ubiquity of the cloud makes this possible.

We've all experienced slow computers and slow access to data. In business, that slowness is as costly as it is frustrating. For example, a slow data network and a very slow Internet connection discouraged the employees of a not-for-profit job recruitment agency. When the agency decided to move Office 365 and other business-critical applications to the cloud, the CIO wanted those applications to behave as if their files were stored locally. Add in the complication that the Office 365 hosting site was in Singapore, nearly 4,000 miles from the agency's Sydney headquarters, and the challenges compounded.
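To see why that distance matters, consider a back-of-the-envelope latency estimate. The sketch below uses assumed figures (roughly 6,300 km of great-circle distance and a signal speed in fiber of about two-thirds the speed of light); real routes are longer and add routing and queuing delays on top.

    # Rough lower bound on round-trip time for the Singapore-Sydney path.
    # Assumed figures: ~6,300 km ("nearly 4,000 miles") and signal speed
    # in fiber of ~200,000 km/s. Actual paths are longer and slower.
    distance_km = 6_300
    fiber_speed_km_per_s = 200_000

    one_way_ms = distance_km / fiber_speed_km_per_s * 1_000  # about 31.5 ms
    round_trip_ms = 2 * one_way_ms                           # about 63 ms

    print(f"best-case round trip: {round_trip_ms:.0f} ms")

    # A chatty protocol needing 100 round trips to open a document spends
    # over six seconds doing nothing but waiting on the speed of light.
    print(f"100 round trips: {100 * round_trip_ms / 1_000:.1f} s")

No amount of extra bandwidth removes that delay; only reducing the number of round trips, or moving the endpoints closer together, does. That is the problem the technologies described below set out to solve.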

Now imagine that location didn’t matter and that technology could create the appearance of bending the laws of physics a little bit. IT managers could tap into the cloud as efficiently as if all its resources were stored in the next room. Applications could run at full speed, making employees more productive, regardless of where their applications and data were stored.

The next chapter of the IT story is location-independent computing. Underpinned by technologies such as data deduplication, bandwidth shaping, latency mitigation, and encryption, location-independent computing will make the promise of the cloud a reality by liberating your workload from the tyrannies of place and distance.
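To make the first of those techniques concrete, here is a minimal sketch of hash-based data deduplication. It is a deliberately simplified, hypothetical example: production WAN optimizers typically use content-defined chunking and persistent segment stores synchronized between the two ends of a link, not the fixed-size chunks and in-memory set shown here.

    # Minimal data-deduplication sketch: fingerprint each chunk, transmit
    # the full bytes only the first time, and a short reference thereafter.
    import hashlib

    CHUNK_SIZE = 4096  # bytes; fixed-size chunks keep the sketch simple

    def chunks(data: bytes):
        """Split a byte stream into fixed-size chunks."""
        for i in range(0, len(data), CHUNK_SIZE):
            yield data[i:i + CHUNK_SIZE]

    class DedupEncoder:
        """Replaces chunks the receiver already holds with 32-byte references."""

        def __init__(self):
            self.seen = set()  # SHA-256 fingerprints of chunks sent once

        def encode(self, data: bytes):
            out = []
            for c in chunks(data):
                fp = hashlib.sha256(c).digest()
                if fp in self.seen:
                    out.append(("ref", fp))      # 32-byte reference
                else:
                    self.seen.add(fp)
                    out.append(("raw", fp, c))   # full chunk, sent once
            return out

Send the same document twice through this encoder and the second pass costs about 32 bytes per 4 KB chunk, which is the intuition behind how deduplication shrinks repeat transfers across a WAN.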

Recommendations

Enterprises need an infrastructure strategy that lets them deploy applications anywhere and still deliver a consistently high-quality experience to users in hybrid IT environments. How? A holistic approach that focuses on three dimensions:

  • Analyze. To solve performance problems quickly and plan for architectural changes, enterprises need to collect, synthesize, and process vast amounts of data about applications and the network.
  • Diagnose. Once a problem is analyzed, powerful diagnosis and discovery can determine the solutions that will mitigate it quickly and with minimal disruption.
  • Resolve. After diagnosis is complete, those solutions need to be implemented efficiently and cost-effectively.

Steve Riley is Technical Leader in the Office of the CTO at Riverbed Technology. His specialties include information security, compliance, privacy, and policy. Steve has spoken at hundreds of events around the world. He is co-author of Protect Your Windows Network, contributed a chapter to Auditing Cloud Computing, has published numerous articles, and has conducted technical reviews of several data networking and telecommunications books. Before joining Riverbed, he was the cloud security evangelist at Amazon Web Services and a security consultant and advisor at Microsoft.

Edited by Stefania Viscusi