Who Has the Industry Lead in Cloud Computing?

Amazon, Google, Microsoft Are All Pumping Cloud Computing Steroids

Stephen E. Arnold's Blog

Cloud computing has become commonplace. Amazon has pumped steroids into the Amazon Web Services product line. Microsoft executives have been providing forecasts of a bold new service offering. Other vendors blasting off from mother earth to loftier realms include IBM, Intel, and Rackspace, among other big-name firms.

One of the most interesting documents I have read in months is a forthcoming technical paper from Microsoft’s Albert Greenberg, Parantap Lahiri, David Maltz, Parveen Patel, and Sudipta Sengupta. The paper is available from the ACM as document 978-1-60558-181-1/08/08. I have a hard copy in my hand, and I can’t locate a valid link to an online version; the ACM Digital Library or a for-fee database may help you get this document. In a nutshell, “Towards a Next Generation Data Center Architecture: Scalability and Commoditization” explains some of the technical innovations Microsoft is implementing to handle cloud-based, high-demand, high-availability applications. Some of the information in the paper surprised me. The innovations provide a good indication of the problems Microsoft faced in its older, pre-2008 data centers. It was clear to me that Microsoft is making progress, and some of the methods echo actions Google took as long ago as 1998.

What put the Amazon and Microsoft cloud innovations into sharp relief for me was US2008/0262828, “Encoding and Adaptive Scalable Accessing of Distributed Models.” You can download a copy of this document from the easy-to-use USPTO system. Start here to obtain the full text and diagrams for this patent application. Keep in mind that a patent application does not mean that Google has implemented or will implement the systems and methods disclosed. What the patent application provides is a peephole through which we can look at some of the thinking Google is doing about a particular technical issue. The peephole may be small, but what I saw when I read the document and reviewed the drawings last night (October 24, 2008) sparked my thinking.

Before offering my opinion, let’s look at the abstract for this invention, filed in February 2006 in a provisional application. Keep in mind that we are looking in the rear view mirror here, not at where Google might be today. This historical benchmark is significant when you compare what Amazon and Microsoft are doing to deal with the cloud computing revolution that is gaining momentum. Here’s Google’s summary of the invention:

Systems, methods, and apparatus for accessing distributed models in automated machine processing, including using large language models in machine translation, speech recognition and other applications.

In typical Google style, there’s a certain economy to the description of an invention involving such technical luminaries as Jeff Dean and 12 other Googlers. The focus of the invention is on-the-fly machine translation. However, the inventors make it clear that the precepts of this invention can be applied to other applications as well. As you may know, Google has expanded its online translation capability in the last few months. If you have not explored this service, navigate to http://translate.google.com and try out the system.

The claims for this patent document are somewhat more specific. I can’t run through all 91 claims here, so I will highlight one and leave the other 90 to you. Claim 5 asserts:

The system of claim 4, wherein: the translation server comprises: a plurality of segment translation servers each operable to communicate with the translation model server, the language model servers and replica servers, each segment translation server operable to translate one segment of the source text into the target language, a translation front end to receive the source text and to divide the source text into a plurality of segments in the source language, and a load balancing module in communication with the translation front end to receive the segments of the source text and operable to distribute the segments to the segment translation servers for translation based on work load at the segment translation servers, the load balancing module further operable to direct translated segments in the target language from the segment translation servers to the translation front end.

The claim makes reasonably clear the basic nested structure of Google’s architecture. What impressed me is that this patent document, like other recent Google applications, treats Google’s infrastructure as a platform. The computational and input/output tasks are simply not an issue. Google pretty clearly feels it has the horsepower to handle ad hoc translation in real time without worrying about how data are shoved around within the system. As a result, higher-order applications that were impossible even for certain large government agencies can be made available without much foot dragging. I find this remarkable.
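
To make that nesting concrete, here is a minimal sketch of the flow claim 5 describes: a front end divides the source text into segments, a load-balancing module routes each segment to the least-loaded segment translation server, and the translated segments flow back for reassembly. Every class name, the segmenter, and the load metric below are my own assumptions for illustration, not anything disclosed in the patent.

    # Toy sketch of the claim 5 pipeline. All names and details are
    # my own assumptions, not Google's design.
    from concurrent.futures import ThreadPoolExecutor

    class SegmentTranslationServer:
        """Translates one segment; a real server would consult the
        translation model server and replicated language model servers."""
        def __init__(self, name):
            self.name = name
            self.pending = 0  # crude work-load signal for the balancer

        def translate(self, segment):
            self.pending += 1
            try:
                return f"<{segment} -> target language>"  # stub translation
            finally:
                self.pending -= 1

    class LoadBalancingModule:
        """Routes each segment to the least-loaded segment server and
        returns translated segments to the front end."""
        def __init__(self, servers):
            self.servers = servers

        def dispatch(self, segment):
            server = min(self.servers, key=lambda s: s.pending)
            return server.translate(segment)

    class TranslationFrontEnd:
        """Receives source text, divides it into segments, reassembles."""
        def __init__(self, balancer):
            self.balancer = balancer

        def translate(self, source_text):
            segments = source_text.split(". ")  # naive segmenter for the sketch
            with ThreadPoolExecutor(max_workers=4) as pool:
                # pool.map preserves order, so reassembly is a simple join
                translated = list(pool.map(self.balancer.dispatch, segments))
            return ". ".join(translated)

    servers = [SegmentTranslationServer(f"seg-{i}") for i in range(4)]
    front_end = TranslationFrontEnd(LoadBalancingModule(servers))
    print(front_end.translate("Cloud computing has become commonplace. This is a test"))

The point of the sketch is the division of labor: once the front end and balancer are in place, adding segment servers is the only knob needed to absorb more demand.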

This patent document, if Google is doing what the inventors appear to be saying, describes something significantly different from the innovations I just mentioned from competitors such as Amazon and Microsoft. Google, in my opinion, is making it clear that it has a multi-year lead in cloud computing.

The thoughts that I noted as I worked through the 38 pages of small print in this patent document were:

  1. Google has shifted from solving problems in distributed, massively parallel computing to developing next-generation cloud-centric applications. Machine translation in real time for a global audience for free means heavy demand. This invention essentially said to me, “No problem.”
  2. Google’s infrastructure will become more capable as Google deploys new CPUs and faster storage devices. Google, therefore, can use its commodity approach to hardware and see significant performance gains without spending on exotic gizmos or trying to hack around bottlenecks such as those identified in the Microsoft paper referenced above. (A sketch of this commodity scale-out idea appears after this list.)
  3. Google can, with the deployment of software, deliver global services that other companies cannot match in terms of speed of deployment, operation, and enhancement.
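
On point 2, here is a back-of-the-envelope sketch, again with invented names, of the commodity scale-out idea: a large n-gram language model partitioned across cheap machines, with several replicas per partition. The shard count, replica count, and hashing scheme are all assumptions of mine, not the patent's.

    # Illustrative only: the shard/replica layout is assumed for this
    # sketch, not taken from the patent's actual design.
    import hashlib
    import random

    NUM_SHARDS = 16          # partitions of the n-gram table
    REPLICAS_PER_SHARD = 3   # commodity boxes serving each partition

    class LanguageModelShard:
        """Holds one partition of the n-gram table in memory."""
        def __init__(self, shard_id):
            self.shard_id = shard_id
            self.table = {}  # n-gram -> log probability, loaded at startup

        def lookup(self, ngram):
            return self.table.get(ngram, -9.0)  # stub back-off value

    # A grid of cheap machines: cluster[shard][replica].
    cluster = [[LanguageModelShard(s) for _ in range(REPLICAS_PER_SHARD)]
               for s in range(NUM_SHARDS)]

    def shard_for(ngram):
        """Hash the n-gram to a shard so every client agrees on placement."""
        digest = hashlib.md5(ngram.encode("utf-8")).hexdigest()
        return int(digest, 16) % NUM_SHARDS

    def log_probability(ngram):
        """Query any replica of the owning shard; replicas share read load."""
        replicas = cluster[shard_for(ngram)]
        return random.choice(replicas).lookup(ngram)

    print(log_probability("cloud computing has become"))

Doubling REPLICAS_PER_SHARD roughly doubles read capacity on hot partitions, with no exotic hardware anywhere, which is the commodity argument in miniature.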

I may be wrong, and I often am, but I think Google is not content with its present lead over its rivals. I think this patent document is an indication that Google can put its foot on the gas pedal at any time and operate in a dimension that other companies cannot. Do you agree? Disagree? Let me learn where I am off base. Your view is important because I am finishing a write-up for Infonortics about Google and publishing. Help me think straight. I even invite Cyrus to chime in. The drawings in this patent application are among Google’s best that I have seen.

More Stories By Stephen E. Arnold

Stephen E. Arnold monitors search, content processing, text mining and related topics from his high-tech nerve center in rural Kentucky. He tries to winnow the goose feathers from the giblets. He works with colleagues worldwide to make this Web log useful to those who want to go "beyond search". Contact him at sa [at] arnoldit.com. His Web site with additional information about search is arnoldit.com.

Most Recent Comments
jeffhardy 11/24/08 11:43:02 AM EST

Cloud Computing Fact and Fiction

In mid-November I participated in a session at PubCon regarding Cloud Computing. My goal was to cut through the hype and buzz talk to articulate the real potential benefits and debunk false claims. I got a lot of feedback, so much so that I wrote a follow-up article:
http://www.smartertools.com/blog/archive/2008/11/20/cloud-computing-chal...

It is important that we remember what Cloud Computing is and what it isn't.

Be well,
Jeffrey J. Hardy
http://www.smartertools.com

Jeremy Geelan 10/28/08 04:45:00 AM EDT

Even though Google may be the elephant in the cloud, there are at least 49 others already competing in the cloud computing space, including not just Amazon and Microsoft but also Akamai, Force.com, IBM, Sun, VMware, and a host of others. I had a first shot at a Top Fifty list here: http://cloudcomputing.sys-con.com/node/665165
