By Ryan Hughes
August 20, 2012 06:00 AM EDT
To date, cloud computing's impact on the IT industry has primarily benefited application developers, system admins, and network architects, not the end users of technology.
Yes, IT developers and architects leverage cloud computing's flexible, virtualized compute, storage, and network infrastructure to build resilient applications that eventually benefit end users through faster time-to-market and improved uptime, but the direct benefits to the tech-needy end user still go largely unrecognized.
Most daily users of personal and business-class applications don't have turnkey, on-demand access to the applications they need. At work, IT departments are too slow to deliver the apps they need, or refuse to provide them due to cost, limited resources, or lack of a recognized need. At home, users struggle to deploy software themselves due to complexity, time, or, again, cost.
However, advances in cloud-powered software and service delivery have started to revolutionize the way that end users (both business and consumer) think about acquiring the tools they need to succeed. These innovations will finally give end users their piece of cloud computing value and change the way software is delivered, licensed, and used both online and offline. Over the next several weeks, I will be releasing several blog posts on the topic of the "Future of Cloud Computing." Below is Part 1, which describes the unrealized promise and eventual demise of virtual desktops.
Innovations in streaming application code… rather than streaming pixels… will kill VDI before it even fully arrives.
Do users really like or want virtual desktops? From the start, the concept of virtual desktop infrastructure (VDI) has been flawed for most real-world applications and use cases. No matter how optimized VDI compression vendors claim their proprietary algorithms to be, they are still trying to push a proverbial "watermelon of pixels" through a relatively pinhole-sized network to get what you need to your device. It almost seems like all the stars have to align before VDI actually works for the everyday, multi-location worker.
VDI technology refresher
Virtual desktop infrastructure (VDI) is a method of enabling end users with a client device (PC, laptop, tablet, etc.) to access, log in to, and use a remotely hosted desktop environment. To let you interact with the remote environment, compressed screenshots of the VDI instance's display (what you would see if you were standing in front of its monitor) are streamed continuously over a network connection to your client device's screen. In other words, a user can access a completely different environment, including OS, applications, and network, without having that environment installed on their physical client device.
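To make the pixel-streaming model concrete, here is a minimal sketch of what a VDI session loop boils down to. This is not any vendor's actual protocol; the connection and desktop helpers, frame rate, and compression choice are all assumptions for illustration:

```python
# Minimal sketch of a pixel-streaming (VDI-style) session loop.
# The connection/desktop helpers are hypothetical stand-ins for whatever a
# real remote-display protocol provides.
import time
import zlib

FRAME_INTERVAL = 1 / 30  # assume a ~30 fps target

def vdi_session_loop(connection, desktop):
    """Stream compressed screenshots to the client; apply its input remotely."""
    while connection.is_open():
        # 1. Apply any keyboard/mouse events the client sent back.
        for event in connection.read_input_events():
            desktop.apply_input(event)

        # 2. Grab the current screen contents of the hosted desktop.
        frame = desktop.capture_framebuffer()       # raw pixel bytes

        # 3. Compress and ship the pixels; the client only ever sees images.
        connection.send_frame(zlib.compress(frame))

        time.sleep(FRAME_INTERVAL)
```

Every frame has to survive the network round trip, which is exactly why the experience degrades the moment the connection does.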
What’s so wrong with VDI now?
For the typical everyday business user, who works from a combination of office, home, client site, and car, a virtual desktop sounds perfect, but in actual practice it's a real productivity killer due to several key flaws.
- No offline access. VDI requires a persistent high-speed internet connection for the entire session on your virtual desktop… and while Wi-Fi is supposedly everywhere, it never seems to be reliable, fast, or secure… making access to your applications and data "anywhere and anytime" more of an under-delivered promise than a reliable reality.
- Not as "Green" as advertised. For all its press about being “green”, VDI is actually incredibly wasteful because it is architected to leverage only the compute and storage of a hosted server or cloud environment, while completely ignoring the processing and storage power of your personal PC or tablet client-side device. With the exception of true “thin-clients”, which are not widely used by consumer nor businesses to-date because of their inflexibility to be used for anything expect for VDI, your client device, whether it be a desktop or laptop PC, is still powered on and consuming a similar amount of power as it would if you were utilizing its local resources rather than just viewing the streamed screen shots of your VDI instance to your device’s display. Powerful client-side (e.g. PC, Mac) devices are so relatively cheap, yet are virtually (no pun intended) wasted when leveraging VDI.
- Performance and graphics degradation. VDI struggles with graphics-intensive applications such as engineering, drawing, CAD, GIS, and gaming software because most implementations cannot use the device's local graphics card to render complex or fast-moving graphics; instead they stream non-3D and/or pixelated graphics from the VDI instance.
- Cost. A typical private VDI environment from a leading vendor easily runs into the millions of dollars after accounting for new data center space, servers, networking, storage, and virtualization licensing. That is a large price to pay to duplicate, and even degrade, some of the applications and services your users are already using.
How were we convinced streaming screenshots was "the right way" anyhow?
Undoubtedly there are benefits to VDI, but most of them accrue to the IT staff, not end users, and center on license management, patching, and security. Although I understand these benefits, I don't know how IT shops got on the path of streaming pixels with VDI rather than serving the code itself, which would let them optimize and control application delivery and licensing far better than streaming screenshots ever could.
Using the server-side to deliver application functionality, data, and licensing on-demand to devices directly
Sending pieces of the code to your device, using your local device's processing power to run it, and getting updates pushed from the mothership server whenever you connect or security requires it is a much more streamlined approach than relying on a high-speed connection to stream screenshots from a remote data center slice of a server. In this scenario, IT admins still get all the manageability benefits and licensing controls for deploying applications on demand that they get from VDI… all without spinning up an entire cloud infrastructure to host a VDI back end and without wasting perfectly good client-side resources.
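As a back-of-the-napkin sketch (not any existing product), a code-streaming client might fetch only the modules a user needs, cache them locally, run them on the device's own hardware, and pick up pushed updates whenever it reconnects. Every name here (the delivery server URL, fetch_module, the cache location, and the assumption that a streamed module exposes a main function) is invented for illustration:

```python
# Sketch of a code-streaming client: pull modules on demand, run locally,
# sync updates when connected. All endpoints and helpers are hypothetical.
import importlib.util
from pathlib import Path

import requests  # assumed available in the environment

SERVER = "https://apps.example.com"      # hypothetical delivery server
CACHE = Path.home() / ".app_cache"       # streamed code lives here

def fetch_module(name: str) -> Path:
    """Download a module only if the cached copy is stale or missing."""
    CACHE.mkdir(exist_ok=True)
    local = CACHE / f"{name}.py"
    resp = requests.get(f"{SERVER}/modules/{name}",
                        headers={"If-None-Match": etag_of(local)})
    if resp.status_code == 200:          # server pushed new or updated code
        local.write_bytes(resp.content)
    return local                         # 304 -> cached copy is current

def load_and_run(name: str, *args):
    """Execute the streamed module using the device's own CPU/GPU."""
    spec = importlib.util.spec_from_file_location(name, fetch_module(name))
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return module.main(*args)            # assume streamed modules expose main()

def etag_of(path: Path) -> str:
    return str(path.stat().st_mtime) if path.exists() else ""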
How to replace VDI... Streaming application code, not pixels
• Any software delivered to your own device
• VDI-like features still present: updates/patches pushed, zero-footprint device wiping (see the wipe sketch after this list)
• Fast, reliable, offline-accessible local storage and processing (w/admin approval)
• Native graphics performance for CAD, GIS, Visualization, Gaming, etc.
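Because only cached modules and license grants ever land on the device, the "zero-footprint device wiping" called out above can be trivial. A minimal sketch, assuming the same hypothetical cache location as the client sketch earlier:

```python
# Sketch of "zero-footprint" wiping for a code-streaming client. The cache
# path is the same hypothetical location used in the client sketch above.
import shutil
from pathlib import Path

CACHE = Path.home() / ".app_cache"

def remote_wipe():
    """Admin-triggered: delete streamed code, cached data, and license grants."""
    shutil.rmtree(CACHE, ignore_errors=True)   # nothing else was ever installed
```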
The next step - making "Cloud-bursting" workloads a reality
• Application code, data, and compute are on local device & cloud for ultimate workload flexibility.
• Local & Cloud Storage Sync for redundancy and faster processing by chosen processing destination
• Local & Cloud Processing Capabilities – Cloud-bursting a workload becomes a reality
• Native Graphics Performance
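To make the cloud-bursting idea concrete, here is a toy dispatcher that runs a job on the device when the local hardware can handle it and bursts it to a cloud endpoint when it can't. The threshold, the job attributes, and the cloud API below are assumptions for illustration, not a real service:

```python
# Toy cloud-bursting dispatcher: the same job can run on the device or in the
# cloud. The threshold, job attributes, and cloud API are illustrative only.
import os

LOCAL_CORE_LIMIT = os.cpu_count() or 4   # what this device can comfortably chew

def run_job(job, cloud):
    """Run locally if the workload fits; otherwise burst it to the cloud."""
    if job.estimated_cores <= LOCAL_CORE_LIMIT and not job.deadline_is_tight:
        return job.run_locally()          # free, uses hardware you already own
    # Data is already synced to cloud storage, so only the work request moves.
    handle = cloud.submit(job.id, priority=job.priority)
    return cloud.wait_for_result(handle)
```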
More Advantages of streaming application code rather than pixels
- Applications and data can live on both your local device and the cloud, enabling you to "cloud-burst" large jobs
- Enables you to choose where your requests are processed: the location, speed, and even cost of your processing jobs
- More flexible and functionality-based licensing terms
- Stream apps to first responders in disaster response situations, then remote-wipe them once tasks are complete
- Sales teams can easily give customers full trials with automatic licensing time-bombs
- Create SaaS-like easy deployment without changing a single thing about your successful legacy desktop applications
- Similar benefits to traditional VDI for application updates, bulk maintenance, and security
- Admins are still administering one application package for everyone to use
- Can auto-push critical security patches or application updates
- Enables offline usage
- Since the application code runs on the client device, with admin approval, users can take the application offline indefinitely or, using time-bomb or usage-bomb licensing, admins can limit usage of the application to a certain period of time or a specific task only (see the licensing sketch after this list).
- Extends the life of Desktop applications
- Traditional "boxed" software companies are spending millions of dollars and years of R&D time to re-engineer their software “for the cloud” because they think they only way to cloud-enable their software is to write from scratch a multi-tenant web application that recreates their technology’s traditional functionality. However, the usual outcome of this new SaaS development is watered-down, bug-ridden functions compared to their flagship desktop product functions
- Less risk of software piracy
- Since only the application code for the functions you need is streamed to you, your computer never holds the full application code, making it much harder, if not impossible, to pirate, repackage, and resell a full pirated version of the software.
- Superior application performance and 3D graphics rendering
- Streaming code to the device instead of pixels could remedy probably the biggest problem with VDI: application performance and graphics rendering.
- This enables entire industries like CAD, mapping (geospatial, GIS), gaming, and more to become usable and controllable, rather than becoming "IT silos" that get treated, managed, updated, and secured differently than other, non-graphics-intensive applications.
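To illustrate the time-bomb/usage-bomb licensing mentioned above, here is a minimal sketch of a client-side license check an admin might push alongside streamed code. The grant fields and enforcement behavior are assumptions for illustration, not any vendor's actual scheme:

```python
# Minimal sketch of time-bomb / usage-bomb licensing for streamed code.
# The grant format and enforcement behavior are illustrative assumptions.
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

@dataclass
class LicenseGrant:
    issued_at: datetime
    valid_for: timedelta          # time-bomb: offline use expires after this
    max_launches: int             # usage-bomb: or after this many runs
    launches_used: int = 0

def may_launch(grant: LicenseGrant, now: Optional[datetime] = None) -> bool:
    """Return True only while the admin-issued grant is still alive."""
    now = now or datetime.utcnow()
    if now > grant.issued_at + grant.valid_for:
        return False              # time-bomb expired; refuse to run offline
    if grant.launches_used >= grant.max_launches:
        return False              # usage-bomb exhausted
    grant.launches_used += 1      # count this launch against the grant
    return True

# Example: a 30-day offline trial limited to 50 launches
trial = LicenseGrant(datetime.utcnow(), timedelta(days=30), max_launches=50)
assert may_launch(trial)
```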
Although the technology to pull off this type of code-streaming environment might not be fully baked yet, the groundwork for replacing pixel-streaming VDI has already been laid. As the cost of cutting-edge client devices drops and their processing and graphics capabilities continue to wow customers and set expectations for user experience, VDI implementations will continue to fail at achieving their once-great promise of streaming any application to every user via only a web connection. VDI seems to be only a patch-over solution while we wait for something better to come along. Code streaming to client devices may be that answer.

Watch for my upcoming post: The Future of Cloud Computing - Part 2: Why PaaS will fail and how Software-Stacks-as-a-Service (SSaaS) will replace it.