
Cloud Expo: Article

Cloud Computing and the Future of the Enterprise Data Center

The enterprise computing world is seeing a revolution last seen during the evolution and adoption of the internet

This article discusses how the enterprise data center is dissolving into a virtual computing space where all computing resources are addressed through the Internet. It traces the evolution of cloud computing, its effect on enterprise IT strategy, and the future of enterprise computing in the data center as we know it.

Introduction
The enterprise computing world is seeing a revolution last seen during the evolution and adoption of the Internet. The brick-and-mortar Data Center, central to all the computing needs of an enterprise, is under pressure as CIOs crunch ROI and TCO numbers. The ever-increasing cost of maintaining an enterprise Data Center is forcing businesses to examine these numbers and implement strategic decisions to cut costs. The common refrain is the ever-increasing cost of running a large Data Center: heating, cooling, electricity, and the maintenance of both hardware and resources.

This ROI and cost approach has been historically focused on the hardware and the facilities.

Hardware
With ever-increasing computing needs, business computing went with bigger machines: multi-processor, multithreaded servers that could handle large application loads. These machines needed a large initial investment and incremental maintenance and support costs. In most cases the capacity utilization of these machines was low, leading to a poor ROI. IT managers then moved their applications to smaller-footprint machines, and blade systems, with multiple blades sharing the same chassis and a small footprint, have become increasingly popular. The modular nature of the hardware and the ability to hot-swap components has made this move very popular with IT managers.

Virtualization
The next step in the evolution toward smaller-footprint computing and increased hardware utilization was the adoption of virtualization strategies across all platforms. With the advent of virtualization in almost all areas, from Wintel to Unix to storage, a large number of virtual machines could be hosted on the same physical machine, with multiple applications and users serviced from these virtual machines. This has meant that the system administrator no longer has to be in close proximity to the server; he just needs to be able to access the machine over the network.

With virtualization in place, a large push to outsource and offshore application support, and easy, cheap access to relatively large bandwidth, moving the support of these virtualized machines to outsourced or offshore service partners has become a reality. More and more enterprises that have long-standing, fruitful relationships with their outsourced or offshore partners are making a concerted move toward remote management services for their enterprise hardware.

In all of the cases above the hardware is still hosted out of a brick-and-mortar Data Center. Large enterprises that own and manage their own data centers continue to bear the cost of operating the facility: the real estate, power, cooling, maintenance and upkeep of the backup systems, and the resources needed to run the data center. Smaller enterprises have moved to co-located data center facilities that manage the facility for a fee. All of them own and support their own hardware, or at least pay for it through outsourced partners.

Cloud Computing
With the advent of cloud computing and the successes of various IT functions that serve their users with applications and platforms over the Internet, it is becoming clear that this method of IT service and delivery will fundamentally change the future of the Data Center relative to the enterprise. Cloud computing merges many new technologies into an amalgamated whole, including Web 2.0, fast and cheap bandwidth, virtualization, and utility computing.

The U.S. National Institute of Standards and Technology (NIST) defines cloud computing as "a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort."

The cloud computing environment can be examined in many different ways; for all intents and purposes it can be divided into three major areas: Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and Software as a Service (SaaS). Each of these main areas can be divided into smaller, fine-grained areas of interest depending on what the enterprise is interested in moving to the cloud. Some pundits have proposed more categories (Storage as a Service, Database as a Service, Information as a Service, Process as a Service, Application as a Service, Platform as a Service, Integration as a Service, Security as a Service, Management as a Service, Testing as a Service, Infrastructure as a Service).

Thus all aspects of enterprise IT can be serviced out of a cloud.

The cloud computing environment has also been divided into private clouds, public clouds and hybrid clouds.

A public cloud, or external cloud, describes cloud computing in the traditional mainstream sense: resources are dynamically provisioned on a fine-grained, self-service basis over the Internet, via web applications or web services, from an off-site third-party provider who shares resources across multiple client enterprises and bills on a fine-grained, usage-based utility computing basis.

Private cloud and internal cloud are phrases used to describe offerings that emulate cloud computing on private networks. These products claim to "deliver some benefits of cloud computing without the pitfalls," capitalizing on data security, corporate governance, and reliability concerns.

A hybrid cloud, as the name suggests, consists of multiple internal and/or external providers delivering applications to the desktop through a 'thin client' connection, either over the Internet or an intranet.

Moving to the Cloud
The core steps in moving to the cloud can be divided into four successive steps.

  1. Identification
  2. Analysis and Test
  3. Service Provider Selection
  4. Deployment
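The phased nature of these steps can be sketched as a simple tracker that moves each candidate application through the stages in order. The class and application names below are hypothetical illustrations, not part of any particular migration toolkit.

```python
from dataclasses import dataclass, field

# Ordered phases of a cloud migration, mirroring the four steps above.
PHASES = ["Identification", "Analysis and Test",
          "Service Provider Selection", "Deployment"]


@dataclass
class MigrationTracker:
    """Tracks which phase each candidate application is in."""
    apps: dict = field(default_factory=dict)

    def register(self, app: str) -> None:
        # Every application starts at Identification.
        self.apps[app] = 0

    def advance(self, app: str) -> str:
        # Move to the next phase; stay put once Deployment is reached.
        if self.apps[app] < len(PHASES) - 1:
            self.apps[app] += 1
        return PHASES[self.apps[app]]


tracker = MigrationTracker()
tracker.register("payroll")          # hypothetical candidate application
print(tracker.advance("payroll"))    # → Analysis and Test
```

Because the process is incremental, applications can sit at different phases at the same time, which matches the batch-by-batch migration the article describes.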

Identification
After a strategic decision has been made by the enterprise to move to the cloud, the most important challenge is to identify what can be moved to the cloud and in what order: applications, services, data storage and retrieval, and so on. The maturity of processes in the enterprise, as well as its state of virtualization, will be important in making these decisions.

Many enterprises may want to test the cloud computing environment by moving their 'non-core' applications to the cloud. This implies that the enterprise has a very mature business continuity planning (BCP) and disaster recovery (DR) process in place so that suitable 'non-core' applications can be identified. The same holds true for enterprise data, storage, etc.

Application integration technologies and methods also impact the choice of applications to be moved to the cloud. A highly complex, integrated application and set of services may be a good candidate for a private cloud, whereas standalone applications may be good candidates for a public cloud. The same holds true for enterprise data: confidential, trade-related data may be hosted in a private cloud to manage security concerns, and non-critical data may be hosted in a public cloud. Hence cloud types and configurations can be matched to the risk and security profile of the enterprise.
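The placement heuristic described here can be sketched as a small decision function. The workload attributes and the private/public/hybrid mapping below are illustrative assumptions, not a standard taxonomy.

```python
def recommend_cloud(confidential: bool, tightly_integrated: bool) -> str:
    """Map a workload's risk/integration profile to a cloud type,
    following the heuristic in the text: confidential or highly
    integrated workloads go private; standalone, non-critical
    workloads can go public."""
    if confidential or tightly_integrated:
        return "private"
    return "public"


def portfolio_recommendation(workloads) -> str:
    """If a portfolio mixes private and public candidates,
    the overall posture is a hybrid cloud."""
    kinds = {recommend_cloud(conf, integ) for conf, integ in workloads}
    return "hybrid" if len(kinds) > 1 else kinds.pop()


# Hypothetical portfolio: one sensitive ERP, one standalone brochure site.
print(portfolio_recommendation([(True, True), (False, False)]))  # → hybrid
```

A real assessment would weigh many more attributes (regulatory scope, data residency, latency), but the shape of the decision is the same.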

Analysis and Test
Once the applications or platforms have been selected, it is essential to test them under the cloud scenario. If an application will be accessed through a private cloud, a testing sandbox comprising the application, storage, intranet access, etc., needs to be created. This is a validation exercise to test performance under various load conditions. The results of this testing should align with the SLAs established with the business; otherwise moving to the cloud can adversely affect business functions.
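As a minimal sketch of this validation exercise, the snippet below collects simulated response times and checks the 95th-percentile latency against an assumed SLA threshold. The request function and the 500 ms limit are placeholders standing in for the real sandboxed application and its negotiated SLA.

```python
import random
import statistics


def run_load_test(request_fn, n_requests: int = 1000) -> list:
    """Collect response times (ms) for n_requests calls."""
    return [request_fn() for _ in range(n_requests)]


def meets_sla(latencies: list, p95_limit_ms: float = 500.0) -> bool:
    """Validate the 95th-percentile response time against the SLA."""
    p95 = statistics.quantiles(latencies, n=20)[-1]  # 95th percentile
    return p95 <= p95_limit_ms


random.seed(42)
# Stand-in for a real request to the sandboxed application:
# normally distributed around 200 ms with 50 ms spread.
simulated_request = lambda: random.gauss(200, 50)

latencies = run_load_test(simulated_request)
print("SLA met:", meets_sla(latencies))
```

The same check would be repeated under different load profiles (peak, sustained, burst) before signing off on the migration.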

Service Provider Selection
Once the analysis of the data, processes and services of the identified candidate applications, processes and databases has been completed, service provider selection should be based on those needs. In addition, business requirements on the viability of the service provider (SP), its 'always-on' ability, its means of tracking and preventing machine downtime and recovering from failures dynamically, and the SLAs between the enterprise and the SP need to be considered while choosing the right SP or SPs. Depending on the criticality of the application or process and the BCP associated with it, these may be hosted by multiple SPs with failover capabilities between the sites.
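One simple way to compare candidate service providers on the criteria above is a weighted score. The criteria weights, provider names and ratings below are purely illustrative assumptions, not benchmark data.

```python
# Selection criteria from the text, with illustrative weights summing to 1.0.
CRITERIA_WEIGHTS = {
    "viability": 0.3,         # financial/business viability of the SP
    "always_on": 0.3,         # availability track record
    "failure_recovery": 0.2,  # dynamic recovery from failures
    "sla_terms": 0.2,         # strength of the offered SLA
}


def score_provider(ratings: dict) -> float:
    """Weighted sum of 0-10 ratings, one per criterion."""
    return sum(CRITERIA_WEIGHTS[c] * ratings[c] for c in CRITERIA_WEIGHTS)


# Hypothetical candidates with made-up ratings.
providers = {
    "SP-A": {"viability": 8, "always_on": 9, "failure_recovery": 7, "sla_terms": 6},
    "SP-B": {"viability": 7, "always_on": 6, "failure_recovery": 9, "sla_terms": 9},
}

best = max(providers, key=lambda p: score_provider(providers[p]))
print(best)  # SP-A scores highest with these illustrative ratings
```

For critical applications hosted across multiple SPs with failover, the same scoring can be run per site, with the runner-up serving as the failover target.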

Deployment
This is the last step in the process, where application code, storage, etc., are ported or migrated to the cloud. This could involve the creation of new services and processes to support the new environment. It is an incremental process: as enterprise IT gets comfortable with the migration process, applications can be moved to the cloud in phases or batches.

Limitations
The maturity of enterprise computing is very important as companies look at moving to the cloud. A very mature BCP and DR process, as well as a highly virtualized Data Center environment, will be a prerequisite before companies can make the jump. The definition of core and non-core applications and data, and the choice of the initial applications to move to the cloud, are also critical. Some legacy graphical applications that are multi-desktop at the user end are not good candidates to move to the cloud. Applications whose performance has been fine-tuned to the extent that any change in the hardware environment adversely affects performance are also not good initial candidates.

The Future
As more and more companies look at cloud computing as a viable alternative to a brick-and-mortar data center, the future of the Data Center as we know it is expected to change drastically. As service providers address risk concerns by having their cloud data centers audited in line with COBIT, PCI DSS, SOX and ISO 27002, to prove that the data in their data centers is safe and that the various environments are physically and logically separated, more and more CIOs are expected to move their Data Center environments to the cloud (up to 65% per Gartner's report). Enterprise Data Centers, NOCs, etc., may be a dying breed in the next decade.

More Stories By Debasish Chanda

Deb Chanda has 20 years of infrastructure and process consulting, thought leadership, client management, program and portfolio management and custom solution architecture experience at large global customers in multiple domains (manufacturing, distribution/retail/CPG, health care, and oil and gas). He has extensive domain expertise in enterprise architecture and cloud computing, data center architecture, IT Strategy and optimization, remote management services including global outsourcing, practice program and project management, and business process re-engineering.
