The Answer Is the Cloud – Now What’s the Question?

Cloud Computing represents the new way for businesses to re-connect with their customers

In Lewis Carroll's classic story "Through the Looking Glass," Humpty Dumpty remarked: "When I use a word, it means just what I choose it to mean - neither more nor less." It seems that the same principle applies to almost any industry expert and IT vendor when they talk about Cloud Computing. So, in an effort not to fall into the same trap as Humpty Dumpty, let's start with the obvious first question:

What exactly is Cloud Computing?
The most authoritative definition is from the National Institute of Standards and Technology (NIST), the U.S. federal technology agency that works with industry to develop and apply technology, measurements, and standards. The latest version of the "NIST Definition of Cloud Computing" is available online, but it can be summarized as shown in Figure 1.

What's driving Cloud Computing today?
As the world gradually pulls itself out of recession, companies are starting to implement "return to growth" strategies, which means growing revenue rather than trying to cut their way to profitability. An essential part of this is to re-connect with customers by aligning their marketing and sales channels with the ways that their customers want to evaluate and purchase their products. As they do this, an undeniable truth emerges: the way that customers expect to interact with them has undergone a fundamental and permanent change, on a scale and at a pace never seen before. Businesses find a whole new generation of customers who are impatient, unencumbered by antiquated notions such as brand loyalty, and who expect things to work the way they want them to work. Customers now demand speed, immediacy, and ease of use. They expect to be able to do business with you wherever and whenever they want, and on whatever device they choose. The new customer experience benchmarks are Facebook, YouTube and iTunes, and if you can't provide that quality of experience, they'll simply find someone who can.

So when businesses turn their attention from survival mode to growth mode, they quickly realize that "reconnecting with the customer" is not a return to business as usual, but something that requires a complete rethink of the way they work, both externally and internally.

The new business imperatives for customer interaction are agility to meet rapidly changing market conditions, flexibility in the way that they do business, and rapid time-to-value as trends are increasingly measured in days and weeks - not months and years. Inside organizations, new tools and business processes are needed to create demand in new ways, manage new distribution channels, communicate value to customers and provide visibility into rapidly changing customer trends.

Given the huge amount of publicity, it is inevitable that the CEO will hear or read the pitch that "Cloud provides agility, flexibility, and quicker time-to-value" and get hooked. What's keeping CEOs awake at night is the need for a fundamental change in the way that they interact with customers, and the answer appears to be right there in front of them - Cloud Computing. So they immediately start challenging the IT department to develop a cloud strategy. As evidence of this, a survey conducted by the 451 Group in June 2010 confirms that it is CEOs, not CIOs, who are driving Cloud Computing initiatives in most organizations.

The answer to the question - what's driving cloud computing - is very clear: Business Needs. Led by the CEO, the primary driver of cloud computing inside most organizations today is the line of business, where it's seen as an essential component of a "return to growth" strategy. Reducing cost, the primary focus of the last few years, is still important, but for a growing number of companies it is no longer the top budget priority.

What does this mean to IT?
IT's traditional reaction to pressures from the business and customers, especially in larger enterprises, is to fold new requirements into the rolling three-year or five-year strategic IT plan. After all, building a new sales force automation or customer relationship management solution takes time - there are RFI and RFQ processes to go through, detailed ROI calculations, budget approval cycles and extensive vendor contract negotiations. Once that's all done, the lengthy implementation phase can begin, where the chosen solution is customized (sometimes extensively) to fit the company's systems.

For most businesses, this process is a frustrating "take it or leave it" approach driven by IT, executed at IT's pace, and riddled with delays and cost overruns. What's more, it's completely inconsistent with the customer-facing and internal imperatives that the CEO and business leaders are now grappling with - a mode of operation that IT simply cannot sustain.

Cloud Computing offers a compelling alternative to the old way of providing IT services. Instead of internally developed monolithic systems, with lengthy and costly implementations of customized third-party business solutions, Cloud Computing provides an agile and flexible environment with shorter solution implementation cycles at a much lower cost. It represents a fundamental shift in the way that enterprises acquire and implement new IT functionality (computing power, storage, software, etc.) to support customer and organizational needs. In short, Cloud Computing offers IT a new way of implementing the functionality that the business units are demanding, and at a speed and cost that meets their expectations.

What this means to IT is that they are facing a critical choice that has to be made soon - either "Do nothing" or "Lead from the front." If they do nothing, business units now have a choice of their own, and they'll turn to any of the hundreds of SaaS vendors that can deliver 95 percent of the new functionality they need. These "fly under the corporate IT radar" solutions can be up and running in the time it takes to enter a credit card number, so the business can have a capable sales force automation solution today - with no commitment, no delay, and no IT.

"Leading from the front" is the only right course. IT owns IT, regardless of whether it comes from inside or outside the organization. Along with delivering completely new applications to the business, Cloud Computing will allow IT to enhance the functionality of existing applications by leveraging content and services from third-party providers. These "borderless" applications offer a best-of-both-worlds approach - the existing investments in legacy applications and the "systems of record" are protected, and new functionality to meet new needs can be delivered quickly and at a low initial cost.

What are the risks, and how can IT mitigate them?
From a line-of-business perspective, Cloud Computing is raising expectations on how quickly and cost-effectively new IT functionality can be made available to them. More important, even though the delivery chain for these "borderless applications" now crosses organizational and geographic boundaries, users will still expect the applications to perform well, and will hold IT accountable if they don't.

The bottom line is that IT has to meet the business' expectation of faster delivery of new functionality and good performance, while at the same time addressing two key risks: ensuring that sensitive data remains protected in compliance with company policy and state/federal legislation; and maintaining end-to-end visibility and control of service performance and availability of borderless applications.

Security
For many IT organizations, data security was a "show stopper" for adopting Cloud Computing, especially for applications in public clouds, simply because there were no existing solutions that addressed the unique security issues posed by the cloud. However, a new generation of security products from industry leaders such as Symantec, McAfee (soon to be acquired by Intel) and Covisint is changing the security landscape. When combined with best practices from industry analysts such as Gartner, the issues are being effectively addressed for an increasing number of companies, including those in heavily regulated industries. Security is, and always will be, a critical issue whether companies are "in the cloud" or not, but it is no longer necessarily a show stopper.
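The specifics of those vendor products are beyond the scope of this article, but one of the underlying principles - keeping sensitive fields encrypted before they ever leave the enterprise - can be sketched in a few lines of Python. This is only an illustration of the idea (it uses the third-party cryptography package, and key handling is deliberately simplified), not a description of any particular vendor's product.

```python
from cryptography.fernet import Fernet

# In practice the key would live in an on-premise key management system,
# never alongside the data stored with the cloud provider.
key = Fernet.generate_key()
cipher = Fernet(key)

ssn_plain = b"123-45-6789"                 # example sensitive field
ssn_encrypted = cipher.encrypt(ssn_plain)  # this ciphertext is what goes to the cloud

# Only systems holding the key (inside the enterprise) can recover the value.
assert cipher.decrypt(ssn_encrypted) == ssn_plain
```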

Performance and Availability
From an end-user perspective, poor performance or non-availability of an application looks exactly the same regardless of where the problem lies in the service delivery chain - at the service provider, in the data center, across the network, in the enterprise or on the end user's own device - and it has exactly the same productivity impact on the business. Rapid resolution requires end-to-end visibility of the entire service delivery chain to isolate and fix the problem. The difficulty for many organizations is that the current generation of Application Performance Management (APM) solutions from most vendors fails to meet that challenge, because they address data center or Internet performance issues through a narrow, compartmentalized view.

Recent experience by companies who have actually implemented cloud solutions paints an interesting picture of where IT should be focusing its risk mitigation efforts to ensure that cloud delivers real business benefits. Prior to implementation, many IT departments were unconvinced that Cloud Computing would deliver the promised business agility and flexibility benefits, and believed that the big win would be cost savings. They also believed that security concerns would tower above everything else as the number one unresolved problem, and that application performance and service level management problems would be solved by simply extending the capabilities of their existing APM solutions. Practical experience was quite different. Agility and flexibility turned out to be the number one win by a long shot; and performance and availability turned out to be tough problems that couldn't be effectively solved with their existing or planned APM solutions.

What's so hard about managing performance in the cloud?
Service providers are typically unwilling to commit to specific service level agreements, and for those that do, there is a lot of inconsistency - and confusion - in their definitions of performance and availability. Amazon, for instance, currently quotes availability in terms of "outages" - periods of five minutes or more during the service year in which Amazon EC2 was in the state of "region unavailable." Others prefer to quote more general statistics such as "multiple redundant gigabit Internet connections" and "greater than 99.95% service availability." To put these figures into context, 99.95% availability means that unplanned downtime of a cloud-based service will average no more than about 22 minutes per month. Compare this with roughly 95 minutes per month of downtime for the average Exchange server, and the initial reaction is that there's no need to worry about performance and availability. However, this is a very dangerous assumption, since it ignores a critically important point: the service provider is just one part of the application delivery chain.
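The arithmetic behind those figures is straightforward, as the short calculation below shows (assuming a 30-day month):

```python
def downtime_minutes(availability_pct: float, period_minutes: float) -> float:
    """Unplanned downtime implied by an availability percentage."""
    return (1 - availability_pct / 100.0) * period_minutes

MINUTES_PER_MONTH = 30 * 24 * 60   # 43,200 (30-day month)
MINUTES_PER_YEAR = 365 * 24 * 60   # 525,600

print(downtime_minutes(99.95, MINUTES_PER_MONTH))  # ~21.6 minutes/month
print(downtime_minutes(99.95, MINUTES_PER_YEAR))   # ~262.8 minutes/year (~4.4 hours)
```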

Again, from the end user's perspective it makes no difference where in the application delivery chain the problem actually lies - with one of the service providers, in the data center, across the network, in the enterprise, in the cloud or on the user's own device - the productivity impact on the business is the same. Consider, for example, a mortgage loan pre-approval application that uses cloud-based services: what happens if one of those services performs badly, or is not available at all? How can the enterprise determine whether it's a service provider problem, a network problem or an end-user device problem?

To further complicate things, geographic location can also have a dramatic impact on the overall performance of a cloud-based application - somewhat contrary to the popular belief that Internet communication is virtually instantaneous. A worst-case scenario is that "all lights are green" in the data center, but some (not all) customers are complaining about performance issues. Without detailed fault-domain information across the entire delivery chain, it is virtually impossible to isolate and fix performance and availability issues in a timely manner, before they affect more users.
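One way to obtain basic fault-domain information is to time each segment of a request separately. The sketch below is a deliberately coarse, standard-library-only illustration - commercial APM tools measure far more, from many more vantage points - but it shows how DNS resolution, connection setup, the TLS handshake and server response time can be separated for a single HTTPS request (example hostname only):

```python
import socket
import ssl
import time

def timing_breakdown(host: str, path: str = "/", port: int = 443) -> dict:
    """Time the individual segments of a single HTTPS request (in seconds)."""
    timings = {}
    t0 = time.monotonic()

    # DNS resolution
    addr = socket.getaddrinfo(host, port, proto=socket.IPPROTO_TCP)[0][4][:2]
    timings["dns"] = time.monotonic() - t0

    # TCP connection to the resolved address
    t1 = time.monotonic()
    raw = socket.create_connection(addr, timeout=10)
    timings["tcp_connect"] = time.monotonic() - t1

    # TLS handshake
    t2 = time.monotonic()
    conn = ssl.create_default_context().wrap_socket(raw, server_hostname=host)
    timings["tls_handshake"] = time.monotonic() - t2

    # Request/response: time to first byte, then drain the rest
    t3 = time.monotonic()
    request = f"GET {path} HTTP/1.1\r\nHost: {host}\r\nConnection: close\r\n\r\n"
    conn.sendall(request.encode("ascii"))
    conn.recv(1)
    timings["time_to_first_byte"] = time.monotonic() - t3
    while conn.recv(65536):
        pass
    timings["total"] = time.monotonic() - t0
    conn.close()
    return timings

# Comparing breakdowns taken from different locations shows whether DNS,
# the network path or the server itself dominates the response time.
print(timing_breakdown("www.example.com"))
```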

CloudSleuth Web Portal: The Compuware-sponsored CloudSleuth community web portal is designed to meet the growing need for authoritative, objective measurements of cloud service providers. It provides free access to real-time performance and availability visualizations of leading cloud providers around the world, plus other valuable data such as blogs, forums and white papers - all focused on best practices for building, deploying, and managing cloud-based applications.

To illustrate how all the components of the delivery chain can impact performance of a web-based application, Figures 5 through 7 show actual measurements from CloudSleuth, a Compuware-sponsored web portal that provides real-time visualizations of the performance and availability of cloud service providers. CloudSleuth measures the performance of a simple application (no I/O- or CPU-intensive tasks) deployed anonymously at a number of cloud service providers. The Gomez Performance Network is then used to access those applications from backbone and "Last Mile" locations around the world to provide actual performance results. All the tests below use only Amazon EC2 East and West.
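The portal's measurement network is far more sophisticated, but the basic idea - collect response times from probes at many end-user locations and then compare the locations against each other - can be illustrated with a small sketch using hypothetical sample data:

```python
import statistics

# Hypothetical response-time samples (seconds) collected by probes running at
# different end-user locations, all hitting the same cloud-hosted test page.
samples = {
    "New York": [0.42, 0.45, 0.41, 0.47, 0.44],
    "London":   [0.88, 0.91, 0.86, 0.93, 0.90],
    "Cheyenne": [0.55, 2.10, 0.58, 2.40, 0.60],  # intermittent last-mile trouble
}

for location, times in samples.items():
    median, worst = statistics.median(times), max(times)
    flag = "  <-- investigate the last mile / ISP" if worst > 2 * median else ""
    print(f"{location:10s} median={median:.2f}s  worst={worst:.2f}s{flag}")
```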

"Figure 5: Last Mile" Internet Service Provider (ISP) Performance: This test shows how the response time is impacted by the performance of the user's ISP (the so-called "last mile" connection).

Note that users in Wyoming are experiencing performance issues because of "last mile" connectivity problems, not because of Amazon.

Figure 6: Geography: The graphs clearly show that the farther away the user is from the application, the longer the response time.

This test also illustrates that if enterprises have a choice of service providers, it is best to choose one that is nearest to their user and/or customer base.
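One rough way to act on that advice is to measure latency to each candidate provider from machines representative of the user base, rather than from the data center. The sketch below uses hypothetical endpoint names and compares median TCP connect times:

```python
import socket
import statistics
import time

# Hypothetical endpoints for the regions under evaluation; substitute the
# real provider hostnames an enterprise is actually considering.
candidates = {
    "us-east": "east.example-cloud.com",
    "us-west": "west.example-cloud.com",
    "eu-west": "eu.example-cloud.com",
}

def connect_latency(host: str, port: int = 443, attempts: int = 5) -> float:
    """Median TCP connect time to an endpoint, in seconds."""
    times = []
    for _ in range(attempts):
        start = time.monotonic()
        with socket.create_connection((host, port), timeout=5):
            times.append(time.monotonic() - start)
    return statistics.median(times)

# Run from locations where the users actually sit, so the numbers reflect
# what customers will see rather than what the data center sees.
best = min(candidates, key=lambda region: connect_latency(candidates[region]))
print("Lowest-latency region from this vantage point:", best)
```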

Figure 7: Time of Day: This test shows that the performance of cloud service providers is not constant, but can vary quite widely throughout the day. This is generally because the service provider is handling a varying load from other users on their systems.

Figure 7 also illustrates another practical point about cloud performance: The cloud theoretically provides "rapid elasticity," meaning that wide variations in load can be accommodated without significantly impacting the performance of individual applications. In reality, cloud service providers have to live by the same rules of economics as everyone else - they do not have banks of servers lying idle to cope with these peaks in demand. Although applications operate in their own "instances" at the service provider, their performance is affected by what their neighbors are doing!
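This kind of time-of-day effect is easy to surface once response-time measurements are being collected: grouping samples by hour, as in the sketch below (illustrative data only), quickly shows whether the busy hours are consistently slower.

```python
import statistics
from collections import defaultdict
from datetime import datetime

# Hypothetical (timestamp, response-time-in-seconds) measurements taken
# against a cloud-hosted application over several days.
measurements = [
    (datetime(2010, 9, 1, 3, 15), 0.41),
    (datetime(2010, 9, 1, 14, 5), 0.97),
    (datetime(2010, 9, 2, 3, 40), 0.44),
    (datetime(2010, 9, 2, 14, 20), 1.12),
    # ... many more samples ...
]

by_hour = defaultdict(list)
for timestamp, response_time in measurements:
    by_hour[timestamp.hour].append(response_time)

# If the busiest hours are consistently slower, shared load from other
# tenants ("noisy neighbors") at the provider is a likely contributor.
for hour in sorted(by_hour):
    med = statistics.median(by_hour[hour])
    print(f"{hour:02d}:00  median={med:.2f}s  samples={len(by_hour[hour])}")
```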

Conclusion - Putting It All Together
Cloud Computing represents the new way for businesses to re-connect with their customers. It allows IT to meet the business need for agility, flexibility and time-to-value - all of these are vital to success in the new, customer-driven world where "work anywhere is what we do." But despite the increasingly proven business benefits, Cloud Computing introduces new business risks, and IT must play a leadership role in addressing those risks.

A key concern is managing the end-user experience of cloud-based applications by maintaining complete visibility of the performance of these new borderless applications. Fred Smith, the founder of FedEx, once remarked: "Information about the package is as important as the package itself." He was making the case that it's not enough to provide a general statement of service quality; you must be able to present information on the particular service you are delivering to a particular customer at a particular time, regardless of where the package is. The same is true for borderless applications - they require a solution that can monitor and manage application performance regardless of physical, virtual or cloud attributes.

Traditional enterprise application performance management tools are unsuited to the task of managing this new generation of applications, because they only provide narrow, technology-centric keyhole views into the performance of specific components or processes. The only way to truly solve performance and availability problems is through a holistic view of application performance that encompasses the entire application delivery chain.

Figure: End-to-End Visibility Across the Application Delivery Chain

Reference
1. The NIST Definition of Cloud Computing: http://csrc.nist.gov/groups/SNS/cloud-computing/cloud-def-v15.doc

About the Author

Richard Stone is Senior Solution Manager at Compuware, responsible for Cloud-based Application Performance Management solutions.

Prior to joining Compuware, Richard held senior marketing and product management positions at Hewlett-Packard and Compaq, as well as a number of other US and European IT companies. He has extensive experience in cloud-based solutions and technologies, and has brought a number of cloud-based solutions to market. These include cross-industry solutions such as E-Mail, Web Conferencing, and Sales Force Automation, and vertical market solutions in industries such as Insurance, Retail Banking, and Telecommunications. His domain expertise also includes mobile computing, security, compliance, and high-availability solutions for all market segments (SMB, Enterprise, and key verticals such as Finance, Government, Healthcare, and Retail).
