Cloud Computing and the Future of the Enterprise Data Center

The enterprise computing world is seeing a revolution last seen during the evolution and adoption of the internet

This article discusses how the enterprise data center is dissolving into a virtual computing space where all computing resources are addressed through the Internet. It traces the evolution of cloud computing, its effect on enterprise IT strategy, and the future of enterprise computing in the data center as we know it.

Introduction
The enterprise computing world is seeing a revolution last seen during the evolution and adoption of the Internet. The brick-and-mortar data center, central to all the computing needs of an enterprise, is under pressure as CIOs crunch ROI and TCO numbers. The ever-increasing cost of maintaining an enterprise data center is forcing businesses to scrutinize these numbers and implement strategic decisions to cut costs. The common refrain is the ever-increasing cost of running a large data center: heating, cooling, electricity, and maintenance costs for both hardware and staff.

This ROI and cost approach has been historically focused on the hardware and the facilities.

Hardware
With ever-faster and greater computing needs, business computing went to bigger machines: multi-processor, multithreaded servers that could handle large application loads. These machines needed a large initial investment and incremental maintenance and support costs. In most cases the capacity utilization of these machines was low, leading to a poor ROI. IT managers then moved their applications to smaller-footprint machines, and blade systems, with multiple blades in the same chassis sharing a small footprint, have become increasingly popular. The modular nature of the hardware and the ability to hot-swap components have made this move very popular with IT managers.

Virtualization
The next step in the evolution toward smaller-footprint computing and increased hardware utilization was the adoption of virtualization strategies across all platforms. With the advent of virtualization in almost all areas, from Wintel to Unix to storage, a large number of virtual machines could be hosted on the same physical machine, with multiple applications and users serviced from these virtual machines. This has meant that the system administrator no longer has to be in close proximity to the server; he just needs to be able to access the machine over the network.

With the advent of virtualization, the large push to outsource and offshore application support, and easy, cheap access to relatively large bandwidth, moving the support of these virtualized machines to outsourced or even offshore service partners has become a reality. More and more enterprises with long-standing, fruitful relationships with their outsourced or offshore partners are making a concerted move toward remote management of their enterprise hardware.

In all of the cases above, the hardware remains hosted in a brick-and-mortar data center. Large enterprises that own and manage their own data centers continue to bear the cost of operating the facility: real estate, power, cooling, maintenance and upkeep of backup systems, and the staff needed to run the data center. Smaller enterprises have moved to co-located data center facilities that manage the facility for a fee. All of them own and support their own hardware, or at least pay for it through outsourced partners.

Cloud Computing
With the advent of cloud computing and the success of various IT functions that serve their users with applications and platforms over the Internet, it is becoming clear that this method of IT service delivery will fundamentally change the future of the data center relative to the enterprise. Cloud computing merges many technologies into an amalgamated whole, including Web 2.0, fast and cheap bandwidth, virtualization, and utility computing.

The U.S. National Institute of Standards and Technology (NIST) defines cloud computing as "a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources ... that can be rapidly provisioned and released with minimal management effort."

The cloud computing environment can be examined in many different ways; for all intents and purposes it can be divided into three major areas: Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS). Each of these main areas can be divided into finer-grained areas of interest depending on what the enterprise wants to move to the cloud. Some pundits list more categories: Storage as a Service, Database as a Service, Information as a Service, Process as a Service, Application as a Service, Integration as a Service, Security as a Service, Management as a Service, and Testing as a Service.
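As a hypothetical sketch of this taxonomy, an enterprise might catalog its workloads by service model; the workload names and their assignments below are illustrative assumptions, not part of the article:

```python
# Hypothetical catalog mapping enterprise workloads to cloud service
# models; the workload names and assignments are illustrative only.
SERVICE_MODELS = {
    "IaaS": ["virtual machines", "block storage", "virtual networks"],
    "PaaS": ["application runtimes", "managed databases", "integration buses"],
    "SaaS": ["email", "CRM", "collaboration suites"],
}

def model_for(workload):
    """Return the service model that covers a given workload, if any."""
    for model, workloads in SERVICE_MODELS.items():
        if workload in workloads:
            return model
    return None

print(model_for("CRM"))  # SaaS
```

A real catalog would of course be driven by the enterprise's own application inventory rather than a hard-coded table.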

Thus all aspects of enterprise IT can be serviced out of a cloud.

The cloud computing environment has also been divided into private clouds, public clouds, and hybrid clouds.

A public cloud, or external cloud, describes cloud computing in the traditional mainstream sense: resources are dynamically provisioned on a fine-grained, self-service basis over the Internet, via web applications or web services, from an off-site third-party provider that shares resources across multiple client enterprises and bills on a fine-grained, usage-based utility computing basis.

Private cloud and internal cloud are phrases used to describe offerings that emulate cloud computing on private networks. These products claim to "deliver some benefits of cloud computing without the pitfalls," capitalizing on data security, corporate governance, and reliability concerns.

A hybrid cloud, as the name suggests, consists of multiple internal and/or external providers delivering applications to the desktop through a 'thin client' connection over either the Internet or an intranet.

Moving to the Cloud
Moving to the cloud can be divided into four successive steps:

  1. Identification
  2. Analysis and Test
  3. Target Selection
  4. Deployment

Identification
After an enterprise has made the strategic decision to move to the cloud, its most important challenge is to identify what can be moved and in what order: applications, services, data storage and retrieval, and so on. The maturity of the enterprise's processes, as well as its state of virtualization, will be important in making these decisions.

Many enterprises may want to test the cloud computing environment by moving their 'non-core' applications to the cloud first. This implies that the enterprise has a very mature BCP and DR process in place so that suitable 'non-core' applications can be identified. The same holds true for enterprise data and storage.

Application integration technologies and methods also influence which applications move to the cloud. A highly complex, tightly integrated application and set of services may be a good candidate for a private cloud, whereas standalone applications may be good candidates for a public cloud. The same holds true for enterprise data: confidential, trade-related data may be hosted in a private cloud to manage security concerns, while non-critical data may be hosted in a public cloud. Cloud types and configurations can thus be matched to the risk and security profile of the enterprise.
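This identification heuristic can be sketched as a toy scoring rule. The 1–5 scales for integration complexity and data sensitivity, and the thresholds, are illustrative assumptions, not a standard:

```python
# Illustrative triage of migration candidates: integration complexity and
# data sensitivity (assumed 1-5 scales) steer an application toward a
# private, public, or hybrid cloud. Thresholds are arbitrary examples.
def recommend_cloud(integration_complexity, data_sensitivity):
    if integration_complexity >= 4 or data_sensitivity >= 4:
        return "private"
    if integration_complexity <= 2 and data_sensitivity <= 2:
        return "public"
    return "hybrid"

apps = {
    "trade-settlement": (5, 5),  # deeply integrated, confidential
    "marketing-site": (1, 1),    # standalone, non-critical
    "hr-portal": (3, 3),         # middle ground
}
for name, (ic, ds) in apps.items():
    print(f"{name}: {recommend_cloud(ic, ds)}")
```

In practice the inputs would come from an architecture review and a data classification exercise, not two numbers, but the shape of the decision is the same.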

Analysis and Test
Once the applications or platforms have been selected, it is essential to test them under the cloud scenario. If an application will be accessed through a private cloud, a testing sandbox that comprises the application, storage, intranet access, and so on needs to be created. This is a validation exercise to test performance under various load conditions. The results of this testing should align with the SLAs established with the business; otherwise moving to the cloud can adversely affect business functions.
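A minimal sketch of that SLA validation step might compare a load-test run's 95th-percentile response time against an agreed threshold. The sample latencies and the 500 ms target are assumptions for illustration:

```python
# Sketch of SLA validation after a load test: compare the 95th-percentile
# response time against an SLA threshold. Sample data and the 500 ms
# target are illustrative assumptions.
def percentile(samples, p):
    """Nearest-rank percentile of a list of numbers."""
    ordered = sorted(samples)
    k = max(0, int(round(p / 100 * len(ordered))) - 1)
    return ordered[k]

def meets_sla(response_times_ms, sla_ms=500, p=95):
    """True if the p-th percentile response time is within the SLA."""
    return percentile(response_times_ms, p) <= sla_ms

run = [120, 180, 240, 310, 290, 450, 520, 210, 330, 400]
print(meets_sla(run))
```

A real exercise would also cover throughput, error rates, and sustained-load behavior, but percentile latency against an SLA is the core check.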

Target Selection
Once the analysis of the identified candidate applications, processes, and databases has been completed, the service provider (SP) should be selected based on those needs. In addition, the business viability of the SP, its 'always-on' ability, its capacity to track and prevent machine downtime and recover from failures dynamically, and the SLAs between the enterprise and the SP all need to be considered when choosing the right SP or SPs. Depending on the criticality of the application or process and the BCP associated with it, it may be hosted by multiple SPs with failover capabilities between the sites.
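One common way to weigh these selection criteria against each other is a weighted scorecard. The criteria, weights, provider names, and scores below are hypothetical; a real evaluation would use the enterprise's own due-diligence data:

```python
# Hypothetical weighted scorecard for service-provider selection.
# Criteria weights and the 0-10 scores are illustrative assumptions.
WEIGHTS = {"viability": 0.3, "uptime": 0.4, "sla_terms": 0.3}

def score(provider):
    """Weighted sum of a provider's criterion scores."""
    return sum(WEIGHTS[c] * provider[c] for c in WEIGHTS)

providers = {
    "sp_a": {"viability": 8, "uptime": 9, "sla_terms": 7},
    "sp_b": {"viability": 9, "uptime": 7, "sla_terms": 8},
}
best = max(providers, key=lambda name: score(providers[name]))
print(best, round(score(providers[best]), 2))  # sp_a 8.1
```

The weighting makes the trade-off explicit: here 'always-on' ability (uptime) dominates, so sp_a wins despite sp_b's stronger viability score.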

Deployment
This is the last step in the process, where application code, storage, and so on are ported or migrated to the cloud. It may involve creating new services and processes to support the new environment. This is an incremental process: as enterprise IT gets comfortable with the migration, applications can be moved to the cloud in phases or batches.

Limitations
The maturity of enterprise computing is very important as companies look at moving to the cloud. A very mature BCP and DR process, as well as a highly virtualized data center environment, is a prerequisite before companies can make the jump. The definition of core and non-core applications and data, and the choice of the initial applications to move to the cloud, is also critical. Some legacy graphical applications that are multi-desktop at the user end are not good candidates for the cloud. Applications whose performance has been fine-tuned to the extent that any change in the hardware environment adversely affects performance are also not good initial candidates.

The Future
As more and more companies look at cloud computing as a viable alternative to a brick-and-mortar data center, the data center as we know it is expected to change drastically. As service providers remove the risk concerns of moving applications and data to the cloud by having their cloud data centers audited in line with COBIT, PCI, SOX, and ISO 27002, proving that the data in their data centers is safe and that the various environments are physically and logically separated, more and more CIOs are expected to move their data center environments to the cloud (up to 65%, per Gartner's report). Enterprise data centers, NOCs, and the like may be a dying breed in the next decade.

More Stories By Debasish Chanda

Deb Chanda has 20 years of infrastructure and process consulting, thought leadership, client management, program and portfolio management and custom solution architecture experience at large global customers in multiple domains (manufacturing, distribution/retail/CPG, health care, and oil and gas). He has extensive domain expertise in enterprise architecture and cloud computing, data center architecture, IT Strategy and optimization, remote management services including global outsourcing, practice program and project management, and business process re-engineering.
