By Lori MacVittie
June 16, 2012 09:00 AM EDT
A recent post on the beginning of the HTTP 2.0 War garnered a very relevant question regarding WebSockets and where it fits into (what might shape up to be) an epic battle.
The answer to the question, “Why not consider WebSockets here?” could be easily answered with two words: HTTP headers. It could also be answered with two other words: infrastructure impact.
But I’m guessing Nagesh (and others) would like a bit more detail on that, so here comes the (computer) science.
Different Solutions Have Different Impacts
Due to a simple (and yet profound) difference between the two implementations, WebSockets is less likely to make an impact on the web (and yet more likely to make an impact inside data centers, but more on that another time). Nagesh is correct in that in almost all the important aspects, WebSockets and SPDY are identical (if not in implementation, then in effect). Both are asynchronous, which eliminates the overhead of “polling” generally used to simulate “real-time” updates a la Web 2.0 applications. Both use only a single TCP connection, which also reduces overhead on servers (and infrastructure) and can translate into better performance for the end user. Both can make use of compression (although only via extensions in the case of WebSockets) to reduce the size of the data transferred, resulting, one hopes, in better performance, particularly over more constrained mobile networks.
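To make the polling overhead concrete, here is a minimal sketch of the request-per-update pattern both protocols eliminate; the host, path, and interval are illustrative placeholders, not anything from the original post.

```python
# Simulated "real time" via HTTP polling: every update costs a fresh
# request/response cycle, full headers included, and the poll interval
# becomes a latency floor. Host and path are placeholders.
import http.client
import time

def poll_for_updates(host: str, path: str, interval: float, times: int) -> None:
    for _ in range(times):
        conn = http.client.HTTPConnection(host)
        conn.request("GET", path)              # full header set sent each poll
        print(conn.getresponse().read()[:60])  # whatever changed since last poll
        conn.close()
        time.sleep(interval)                   # updates can be this stale

poll_for_updates("example.com", "/updates", 2.0, 3)
# A WebSocket or SPDY client instead holds one TCP connection open and
# receives each update the moment the server pushes it.
```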
Both protocols operate “outside” HTTP and use an upgrade mechanism to initiate. While WebSockets uses the HTTP Upgrade (and Connection) headers to request the switch, SPDY uses Next Protocol Negotiation (a proposed extension to the TLS specification). This mechanism engenders better backwards compatibility across the web, allowing sites to support both next-generation web applications as well as traditional HTTP.
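For illustration, here is a minimal sketch of the WebSocket opening handshake sent over a raw socket; the host, path, and key are placeholders and error handling is omitted. SPDY's negotiation, by contrast, happens inside the TLS handshake itself.

```python
# A WebSocket client requests the protocol switch with ordinary HTTP
# headers; everything after the server's "101 Switching Protocols"
# response is WebSocket framing, not HTTP.
import base64
import os
import socket

key = base64.b64encode(os.urandom(16)).decode()  # random nonce per the spec
request = (
    "GET /chat HTTP/1.1\r\n"
    "Host: example.com\r\n"
    "Upgrade: websocket\r\n"        # ask to leave HTTP behind
    "Connection: Upgrade\r\n"
    f"Sec-WebSocket-Key: {key}\r\n"
    "Sec-WebSocket-Version: 13\r\n"
    "\r\n"
)

sock = socket.create_connection(("example.com", 80))
sock.sendall(request.encode())
print(sock.recv(4096).decode(errors="replace"))  # expect a 101 on success
```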
Both specifications are designed, as pointed out, to solve the same problems. And both do, in theory and in practice. The difference lies in the HTTP headers – or lack thereof in the case of WebSockets.
Once the connection is established, WebSocket data frames can be sent back and forth between the client and the server in full-duplex mode. Both text and binary frames can be sent in either direction at the same time. The data is minimally framed with just two bytes: in the case of text frames, each frame starts with a 0x00 byte, ends with a 0xFF byte, and contains UTF-8 data in between. WebSocket text frames use a terminator, while binary frames use a length prefix.
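As a toy illustration of the text framing just described, here is a minimal encoder and decoder; note this is the early draft framing the paragraph refers to, and the framing in the final RFC 6455 differs.

```python
# Draft-style WebSocket text framing: 0x00, UTF-8 payload, 0xFF.
# Two bytes of framing overhead versus hundreds of bytes of HTTP headers.
def encode_text_frame(text: str) -> bytes:
    return b"\x00" + text.encode("utf-8") + b"\xff"

def decode_text_frame(frame: bytes) -> str:
    if frame[0] != 0x00 or frame[-1] != 0xFF:
        raise ValueError("not a draft-style text frame")
    return frame[1:-1].decode("utf-8")

assert decode_text_frame(encode_text_frame("hello")) == "hello"
```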
WebSockets does not use HTTP headers; SPDY does. This seemingly simple difference has a disproportionately large impact on supporting infrastructure.
The Impact on Infrastructure
The impact on infrastructure is why WebSockets may be more trouble than it's worth – at least when it comes to public-facing web applications. While both specifications will require gateway translation services until (if) they are fully adopted, WebSockets has a much harsher impact on the intervening infrastructure than does SPDY.
WebSockets effectively blinds infrastructure. IDS, IPS, ADC, firewalls, anti-virus scanners – any service that relies upon HTTP headers to determine the content type or location (URI) of the object being requested – are unable to inspect or validate requests because those headers simply aren't there. Now, SPDY doesn't make it easy – HTTP request headers are compressed – but it doesn't make it nearly as hard, because gzip is pretty well understood and even intermediate infrastructure can deflate and recompress with relative ease (and without needing special data, as is the case with SSL/TLS and certificates).
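As a rough sketch of why compressed headers remain tractable for an intermediary: SPDY actually uses DEFLATE with a predefined shared dictionary and per-session state, which this toy example omits, but the principle is the same.

```python
# An intermediary that understands DEFLATE can recover compressed
# headers, inspect them, and recompress them on the way out -- no
# keys or certificates required. Headers shown are illustrative.
import zlib

headers = (b"GET /index.html HTTP/1.1\r\n"
           b"Host: example.com\r\n"
           b"Accept: text/html\r\n")

compressed = zlib.compress(headers)      # what crosses the wire
recovered = zlib.decompress(compressed)  # what the middlebox inspects
assert recovered == headers
print(f"{len(headers)} bytes -> {len(compressed)} bytes on the wire")
```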
Let me stop for a moment and shamelessly quote myself from a blog on this very subject, “Oops! HTML5 Does it Again”:
One of the things WebSockets does to dramatically improve performance is eliminate all those pesky HTTP headers. You know, things like CONTENT-TYPE. You know, the header that tells the endpoint what kind of content is being transferred, such as text/html and video/avi. One of the things anti-virus and malware scanning solutions are very good at is detecting anomalies in specific types of content. The problem is that without a MIME type, the ability to correctly identify a given object gets a bit iffy. Bits and bytes are bits and bytes, and while you could certainly infer the type based on format “tells” within the actual data, how would you really know? Sure, the HTTP headers could be lying, but generally speaking the application serving the object doesn’t lie about the type of data, and it is a rare vulnerability that attempts to manipulate that value. After all, you want a malicious payload delivered via a specific medium, because that’s the cornerstone upon which many exploits are based – execution of a specific operation against a specific manipulated payload. That means you really need the endpoint to believe the content is of the type it thinks it is.
But couldn’t you just use the URL? Nope – there is no URL associated with objects sent via a WebSocket. There is also no standard application information that next-generation firewalls can use to differentiate the content; developers are free to innovate and create their own formats and micro-formats, and undoubtedly will. And trying to prevent its use is nigh-unto impossible because of the way in which the upgrade handshake is performed – it’s all over HTTP, and stays HTTP. One minute the session is talking understandable HTTP, the next it’s whispering in Lakota, a traditionally oral-only language, which neatly illustrates the overarching point of this post thus far: there’s no way to confidently know what is being passed over a WebSocket unless you “speak” the language used, which you may or may not have access to.
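To see how shaky inferring type from format “tells” really is, consider a toy sniffer keyed on a few well-known magic bytes; the signature table is illustrative, and real scanners carry far larger databases yet face the same dead end with homegrown micro-formats.

```python
# Magic-byte sniffing is a heuristic, not proof: payloads can mimic a
# signature, or (as with custom micro-formats) match nothing at all.
MAGIC_BYTES = {
    b"\x89PNG\r\n\x1a\n": "image/png",
    b"GIF89a": "image/gif",
    b"%PDF-": "application/pdf",
    b"PK\x03\x04": "application/zip",
}

def sniff(payload: bytes) -> str:
    for magic, mime in MAGIC_BYTES.items():
        if payload.startswith(magic):
            return mime
    return "unknown"  # the common case for custom formats over WebSockets

print(sniff(b"%PDF-1.7 ..."))             # "application/pdf" -- probably
print(sniff(b"\x00my-micro-format\xff"))  # "unknown" -- now what?
```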
The result of all this confusion is that security software designed to scan for specific signatures or anomalies within specific types of content can’t. It can’t extract the object flowing through a WebSocket because there’s no indication of where it begins or ends, or even what it is. The loss of HTTP headers that indicate not only type but length is problematic for any software – or hardware, for that matter – that uses the information contained within to extract and process the data.
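A minimal sketch of that asymmetry: the metadata an inspection device keys on is right there in an HTTP exchange, and simply absent from a WebSocket frame (the messages below are illustrative).

```python
# Over HTTP, type and length are declared; over a WebSocket, the same
# payload is anonymous bytes between framing markers.
def http_metadata(raw: bytes):
    head, _, _body = raw.partition(b"\r\n\r\n")
    headers = dict(line.split(b": ", 1)
                   for line in head.split(b"\r\n")[1:] if b": " in line)
    return headers.get(b"Content-Type"), headers.get(b"Content-Length")

http_msg = (b"HTTP/1.1 200 OK\r\n"
            b"Content-Type: text/html\r\n"
            b"Content-Length: 5\r\n\r\nhello")
print(http_metadata(http_msg))  # (b'text/html', b'5') -- inspectable

ws_frame = b"\x00hello\xff"     # same payload: no type, no URI, no length
```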
SPDY, however, does not eliminate these Very-Important-to-Infrastructure-Services HTTP headers; it merely compresses them. Which makes SPDY a much more compelling option than WebSockets. SPDY can be enabled for an entire data center via the use of a single component: a SPDY gateway. WebSockets ostensibly requires the upgrade or replacement of many more infrastructure services and introduces risks that may be unacceptable to many organizations.
And thus my answer to the question “Why not consider WebSockets here?” is simply that while the end result (better performance) of implementing the two may be the same, WebSockets is unlikely to gain widespread acceptance as the protocol du jour for public-facing web applications due to the operational burden it imposes on the rest of the infrastructure.
That doesn’t mean it won’t gain widespread acceptance inside the enterprise. But that’s a topic for another day…