The Benefits of Virtualization and Cloud Computing

Whether you’re an enterprise or small to medium business, you’ll soon be benefiting from the cloud

What’s all the buzz about? Cloud computing is one of Gartner’s top 10 strategic technology trends for 2009 – #2, right behind virtualization. Analysts say the economics of the cloud are truly compelling for customers, with expected savings of 3-5x for business applications. That’s not chump change – particularly in today’s recessionary economy.

But the most compelling benefits of the cloud aren't just cost-savings. They're the increased flexibility, elasticity and scalability available to optimize efficiency and best serve the needs of the business.

What is Cloud Computing?
Whether you're an enterprise or a small to medium business, you'll soon be benefiting from the cloud. But what is cloud computing, exactly?

Cloud computing is essentially the ability to acquire or deliver a resource on demand, configured however the user chooses, and paid for according to consumption. From a supplier's perspective, including both internal IT groups and service providers, it means being able to deliver and manage resource pools and applications in a multi-tenant environment, giving the user an on-demand, pay-per-use service. A cloud service can be infrastructure for hosting applications or data storage, a development platform, or even an application that you can get on demand, either off-site at a provider such as SunGard or Salesforce, or built onsite within IT.
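
To make the on-demand, pay-per-use idea concrete, here is a minimal sketch in Python of how a consumption-based bill might be computed. The resource names and rates are hypothetical, invented for the example, and are not taken from any particular provider's price list.

```python
from dataclasses import dataclass

# Hypothetical hourly rates per resource type; real providers publish their own pricing.
RATES_PER_HOUR = {"small_vm": 0.10, "large_vm": 0.40, "storage_gb": 0.0002}

@dataclass
class UsageRecord:
    resource_type: str   # e.g. "small_vm"
    hours: float         # hours the resource was actually consumed

def monthly_bill(records: list[UsageRecord]) -> float:
    """Pay-per-use: the customer is billed only for what was actually consumed."""
    return sum(RATES_PER_HOUR[r.resource_type] * r.hours for r in records)

# Example: two small VMs running for one week, plus 500 GB of storage for the month.
usage = [
    UsageRecord("small_vm", 168),
    UsageRecord("small_vm", 168),
    UsageRecord("storage_gb", 500 * 720),   # GB-hours over a 720-hour month
]
print(f"Bill this month: ${monthly_bill(usage):.2f}")
```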

It's important to note that while many view cloud computing as services consumed externally, innovative CIOs have taken steps to transform their IT groups into internal service providers. This strategic shift gives them control and accountability for usage and resources, while providing a dynamic, self-service model to accommodate the needs and SLAs required by the business units. To see how one enterprise did this, you can view its video story online at: www.vmware.com/cloud.

Those of us who remember the good old dot-com days, before the bust, saw the concept of hosted services emerge. Everyone jumped on the ASP, ISP and MSP (application service provider, internet service provider and managed service provider, respectively) bandwagons and built offerings to deliver online services or variants thereof, such as on-demand software and software-as-a-service (SaaS).

Looking back at the xSP days, however, we must also remember that the hosted services model had its issues. Few organizations were comfortable having their information hosted outside of their immediate control, and many feared being locked into a relationship with a particular vendor.

So, as the new concept of the cloud emerges, many are asking how it's different this time around and what we should expect. Unlike those previous hosting models, we see well-established companies diversifying their business models to offer new services based on established core competencies. This fundamental difference will help shape and stabilize the new concept of the cloud.

[Photo: VMware CTO Steve Herrod keynoting SYS-CON's 3rd Virtualization Conference & Expo in New York.]

But even more importantly, we have seen new technologies evolve over the past decade that are essential to the notion of the cloud. The key technology is virtualization. In addition to delivering impressive cost savings and environmental benefits, virtualization's ability to separate the OS and application from the hardware gives it ideal properties for delivering these on-demand cloud services. Charles King, Principal Analyst at Pund-IT, put it succinctly: "Without virtualization there is no cloud - that's what enabled the emergence of this new, sustainable industry."

Challenges of the cloud
Today, new and established vendors are vying to deliver cloud services. The challenge for users is choosing the right offering. Many of the offerings are really designed to encourage development on the vendor's proprietary platform, limiting the ability to switch and propagating the offering through applications built for the external cloud only. This is appealing to the development community because it provides quick access to infrastructure and development platforms on which to create a cloud application. But it can become a nightmare for IT when the application has to come back into the enterprise for production-level support, along with the SOX and IP risks that come with it. The popularity of these offerings may also point to a more significant problem: the inability of IT to deliver infrastructure on demand to meet the dynamic needs of these groups. In any case, unless you're building an application from scratch, most businesses don't have the time or resources to rewrite their production applications to work in the cloud on a proprietary platform.

Users should choose a cloud strategy that enables the fastest development time for new applications, with the broadest support for various OSs and development environments, as well as the ability to support production-level applications on- and off-premise as needed.

The other challenge is mobility and choice of location for running applications, internally in a private cloud or externally in a public cloud. Another approach we see in the market is the "superstore phenomenon." Organizations such as Amazon, Microsoft and Google all plan to battle it out over whose superstore datacenter will be the place where your developers build and house their cloud applications. It is true that these are all stable brands and their infrastructure will likely be a safe place to run your applications; however, in the event of outages, downtime and the inability to access your applications, what options will you have? Additionally, how will you manage these instances, where will they live long term, and what risks will be imposed by keeping them off site? Users should be able to move their applications at will from one cloud to another, whether internally or externally.

Obviously, the encapsulation offered by virtualization and the mobility found in technology like VMware VMotion - which enables a live virtual machine to be moved with no downtime for the application - increase a user's ability to move virtual machines as needed. VMware's approach to the cloud is not about vendor lock-in, but about enabling its ecosystem of partners to build and deliver services on a common platform, allowing users to simplify the federation of clouds, on or off premise as needed, across a broad base of service providers.

Lastly, you'll want to look at innovation and stability in providing the technology to leverage your virtualization investments into internal or external cloud options. If your production environments are running on VMware, and you chose that platform for its robust innovation cycles, reliability and technology advancements, you want nothing less in an off-premise cloud services provider. Say you want to establish a relationship with a service provider to offer some flex capacity at the end of the quarter for financial reporting activities. You'll still want the reliability of your production system, control of that environment, and the ability to move your VMs when and where you want. Also, as you build your internal clouds, look for vendors that are building for the future: visionary vendors prepared for whatever new technologies and application infrastructures might come along, and that have proven they can deliver technology innovation in a timely manner.

Why does virtualization matter when building or selecting cloud services/vendors?
Clearly, there's a new trend emerging with lots of options, but also many challenges that could cost big money to reverse. How does virtualization address these challenges and allow a seamless transition to a cloud strategy, either on- or off-premise?

As mentioned above, the key requirements you should demand from your cloud providers are: broad application support without lock-in, easy mobility of environments, broad choice of locations (internal or external), and innovation that drives simplified federation of on- and off-premise clouds. Additionally, as an enterprise you'll want to look for innovation in building the internal (private) cloud to evolve your ability to offer dynamic services.

As noted, virtualization is the key. Most companies' first step on the virtualization path is to consolidate their servers, using virtualization to run multiple applications on each server instead of just one, increasing the utilization rate of (and getting more value from) every server and, thus, dramatically reducing the number of servers they need to buy, rack, power, cool, and manage.
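
As a rough illustration of the consolidation math, the sketch below (in Python) packs the same total workload onto fewer, better-utilized hosts. The utilization figures are assumptions chosen for the example, not measured numbers.

```python
import math

physical_servers_before = 100     # one application per server today
avg_utilization_before = 0.10     # assumed typical utilization of a single-app server
target_utilization_after = 0.60   # assumed utilization goal for virtualized hosts

# The total work stays the same; it is simply packed onto fewer, busier hosts.
total_work = physical_servers_before * avg_utilization_before
hosts_after = math.ceil(total_work / target_utilization_after)

print(f"Hosts needed after consolidation: {hosts_after}")              # 17
print(f"Servers eliminated: {physical_servers_before - hosts_after}")  # 83
```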

Having consolidated your servers, you realize that not only have you substantially cut the capital and operating costs of your server environment, but the entire datacenter has also become far more flexible. Along the way, you may have started to think about and to use IT resources - including servers, storage, networks, desktops, and applications - not as isolated silos that must be managed individually but as pools of resources that can be managed in the aggregate.

This means that you can now move resources around at will across the network, from server to server, datacenter to datacenter, and even out into the cloud, to balance loads and use compute capacity more efficiently across the entire global IT environment.
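
One way to picture pooled resources is a scheduler that places each new workload on whichever host in the pool has the most headroom. The sketch below is a deliberately simple greedy placement in Python, with hypothetical host names and capacities; a real scheduler such as VMware DRS is far more sophisticated, but the idea of treating hosts as one aggregate pool is the same.

```python
def place_vm(free_capacity: dict[str, float], vm_load: float) -> str:
    """Greedy placement: put the VM on the pooled host with the most spare capacity."""
    target = max(free_capacity, key=free_capacity.get)
    if free_capacity[target] < vm_load:
        raise RuntimeError("Pool exhausted; the overflow could federate to an external cloud.")
    free_capacity[target] -= vm_load
    return target

# Remaining free capacity (as a fraction of each host) across the pool; hypothetical values.
pool = {"host-a": 0.30, "host-b": 0.55, "host-c": 0.10}
print(place_vm(pool, 0.25))   # "host-b", the host with the most headroom
```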

In other words, users can look at compute power as a centralized resource that they can allocate to business units on demand, while still maintaining control and operational excellence. Leveraging virtualization to better serve users gives your organization the obvious lower TCO, but it also provides accountability for usage, simplifies and meets the needs of on-demand infrastructure requests, and lets you serve, control and manage SLAs.
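
Accountability for usage is often implemented as chargeback (or showback): metering consumption per business unit and rolling it up against an internal rate. Here is a minimal sketch, assuming simple per-unit VM-hour records and a flat, hypothetical internal rate.

```python
from collections import defaultdict

INTERNAL_RATE_PER_VM_HOUR = 0.08   # assumed internal cost-recovery rate, in dollars

# (business_unit, vm_hours) records pulled from the platform's metering data.
usage_records = [("finance", 1200), ("marketing", 300), ("finance", 450)]

def chargeback(records):
    """Roll metered usage up per business unit so consumption is visible and billable."""
    totals = defaultdict(float)
    for unit, vm_hours in records:
        totals[unit] += vm_hours * INTERNAL_RATE_PER_VM_HOUR
    return dict(totals)

print(chargeback(usage_records))   # {'finance': 132.0, 'marketing': 24.0}
```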

Hence, virtualization has played and will continue to play a huge role in cloud computing. It is the technology that has allowed service providers to deliver lower-cost hosting environments to businesses of all sizes today. Just as virtualization enabled you to consolidate your servers and do more with less hardware, it also lets you support more users per piece of hardware, and deliver applications - and the servers on which they run - faster to those users.

As the leader in virtualization, VMware recently launched its vCloud initiative. With its proven, reliable platform deployed in over 120,000 customer environments today, VMware is committed to working with enterprises that want to build internal clouds with the ability to federate to external providers to meet the changing needs of their business. VMware's virtual datacenter operating system enables internal clouds with features such as self-service provisioning, chargeback, and many other advanced automation and management features.

In addition, VMware is leveraging its huge ecosystem to bring new cloud offerings, such as security for clouds, to market.  The virtualization market leader's approach leverages the infrastructure and expertise of hundreds of partners worldwide, including brand names such as Verizon, Hosting.com, SunGard, Terremark and Savvis to deliver the VMware platform and cloud services.  This, in combination with the technology for internal clouds, lets enterprises run their applications where they want, when they want.

With the largest choice of location and interoperability of platforms, the broadest application and OS support, and leading virtualization and cloud technologies, VMware and its cloud strategy offer users a safe, reliable, and robust on-ramp to the cloud, whether on or off premise.

So, if you're a VMware user, you're in good hands and you've already taken steps toward the cloud simply by virtualizing your servers on a proven platform that offers rich management and automation features. You will see VMware continue to lead the market in delivering cloud innovation for both on- and off-premise clouds.

If you're not a VMware user but want reliable infrastructure on demand, many service providers offer VMware Infrastructure 3 with pay-per-use models.  For more information about VMware vCloud or to find a partner that can help you realize the benefits of the cloud, visit: www.vmware.com/vcloud.

More Stories By Wendy Perilli

Wendy Perilli is director of product marketing for cloud computing at VMware. She gathers market insight from analysts, customers, service and technology partners and many others. With almost 20 years in high-tech fields, Wendy has a broad range of experience with various technologies that offers unique insight into the role virtualization plays in emerging markets and trends.
