Cloud Computing and the Future of the Enterprise Data Center

Enterprise computing is undergoing a revolution on a scale last seen during the rise and adoption of the Internet

This article discusses how the enterprise data center is dissolving into a virtual computing space where all computing resources are addressed through the Internet. It covers the evolution of cloud computing, its effect on enterprise IT strategy, and the future of enterprise computing and the data center as we know it.

Introduction
Enterprise computing is undergoing a revolution on a scale last seen during the rise and adoption of the Internet. The brick-and-mortar Data Center, central to all the computing needs of an enterprise, is under pressure as CIOs crunch ROI and TCO numbers. The ever-increasing cost of maintaining an Enterprise Data Center is forcing businesses to scrutinize those numbers and implement strategic decisions to cut costs. The common refrain is the ever-increasing cost of running a large Data Center: heating, cooling, electricity, and the maintenance of both the hardware and the staff who support it.

This ROI- and cost-driven approach has historically focused on hardware and facilities.

Hardware
To meet ever-greater computing needs, business computing initially turned to bigger machines: multi-processor, multithreaded servers that could handle large application loads. These machines required a large up-front investment plus incremental maintenance and support costs, and in most cases their capacity utilization was low, leading to a poor ROI. IT managers then moved their applications to smaller-footprint machines, and blade systems, with multiple blades sharing a single chassis and a small footprint, have become increasingly popular. The modular nature of the hardware and the ability to hot-swap components have made this move very popular with IT managers.

Virtualization
The next step in the evolution toward smaller-footprint computing and higher hardware utilization was the adoption of virtualization strategies across all platforms. With the advent of virtualization in almost every area, from Wintel to Unix to storage, a large number of virtual machines can be hosted on the same physical machine, with multiple applications and users serviced from those virtual machines. This also means that the system administrator no longer has to be in close proximity to the server; they just need to be able to access the machine over the network.
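
To make the utilization argument concrete, the sketch below shows the kind of back-of-the-envelope consolidation estimate a capacity planner might run. All figures, the overcommit ratio, and the function name are illustrative assumptions, not vendor guidance.

```python
# Hypothetical back-of-the-envelope estimate of VM consolidation on one host.
# All capacity figures and the overcommit ratio are illustrative assumptions.

def vms_per_host(host_cpus, host_ram_gb, vm_cpus, vm_ram_gb, cpu_overcommit=4.0):
    """Return how many identical VMs fit on a host.

    CPU is typically overcommitted (several vCPUs per physical core),
    so RAM is usually the binding constraint.
    """
    by_cpu = int(host_cpus * cpu_overcommit // vm_cpus)
    by_ram = int(host_ram_gb // vm_ram_gb)
    return min(by_cpu, by_ram)

# Example: a 16-core, 128 GB host running 2-vCPU, 8 GB VMs.
print(vms_per_host(host_cpus=16, host_ram_gb=128, vm_cpus=2, vm_ram_gb=8))  # -> 16
```

Even with conservative assumptions like these, one host replaces a rack of underutilized single-application servers, which is the ROI case that drove virtualization adoption.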

Given virtualization, the broad push to outsource and offshore application support, and easy, cheap access to relatively large bandwidth, moving the support of these virtualized machines to outsourced or even offshore service partners has become a reality. More and more enterprises that have long-standing, fruitful relationships with their outsourcing or offshoring partners are making a concerted move toward remote management services for their enterprise hardware.

In all of the cases above, the hardware is still hosted out of a brick-and-mortar Data Center. Large enterprises that own and manage their own data centers continue to bear the cost of operating the facility: real estate, power, cooling, maintenance and upkeep of backup systems, and the staff needed to run the data center. Smaller enterprises have moved to co-located data center facilities that manage the facility for a fee. All of them own and support their own hardware, or at least pay for it through outsourcing partners.

Cloud Computing
With the advent of cloud computing, and the success of various IT organizations in serving their users with applications and platforms over the Internet, it is becoming clear that this method of IT service delivery will fundamentally change the role of the Data Center within the enterprise. Cloud computing merges many newer technologies into an amalgamated whole, including Web 2.0, fast and cheap bandwidth, virtualization, and utility computing.

The U.S. National Institute of Standards and Technology (NIST) defines cloud computing as "a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort."

The cloud computing environment can be examined in many different ways; for all intents and purposes it can be divided into three major areas: Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS). Each of these main areas can be divided into smaller, fine-grained areas of interest depending on what the enterprise wants to move to the cloud. Some pundits have proposed many more categories: Storage as a Service, Database as a Service, Information as a Service, Process as a Service, Application as a Service, Platform as a Service, Integration as a Service, Security as a Service, Management as a Service, Testing as a Service, and Infrastructure as a Service.

Thus, virtually all aspects of enterprise IT can be serviced out of a cloud.

Cloud computing environments are also commonly divided into public, private, and hybrid clouds.

A public cloud, or external cloud, describes cloud computing in the traditional mainstream sense: resources are dynamically provisioned on a fine-grained, self-service basis over the Internet, via web applications or web services, from an off-site third-party provider that shares resources across multiple client enterprises and bills on a fine-grained, usage-based, utility-computing basis.
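
As a concrete illustration of that on-demand, self-service model, the sketch below provisions and then releases a server through a public cloud API. AWS EC2 via the boto3 library is used only as a representative example; the region, AMI ID, and instance type are placeholders, and valid AWS credentials are assumed.

```python
# A minimal sketch of self-service provisioning against a public cloud API.
# The AMI ID and instance type are placeholders; credentials are assumed.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Provision a server in minutes: no procurement cycle, no data-center footprint.
response = ec2.run_instances(
    ImageId="ami-00000000000000000",  # placeholder AMI
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
)
instance_id = response["Instances"][0]["InstanceId"]
print(f"Provisioned {instance_id}")

# Releasing the resource is equally immediate, which is what makes
# fine-grained, utility-style billing possible.
ec2.terminate_instances(InstanceIds=[instance_id])
```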

Private cloud and internal cloud are phrases used to describe offerings that emulate cloud computing on private networks. These products claim to "deliver some benefits of cloud computing without the pitfalls," capitalizing on data security, corporate governance, and reliability concerns.

A hybrid cloud, as the name suggests, consists of multiple internal and/or external providers delivering applications to the desktop through a 'thin client' connection over either the Internet or an intranet.

Moving to the Cloud
The move to the cloud can be divided into four successive steps.

  1. Identification
  2. Analysis and Test
  3. Service Provider Selection
  4. Deployment

Identification
Once the enterprise has made the strategic decision to move to the Cloud, its most important challenge is to identify what can be moved to the cloud and in what order. This includes applications, services, data storage and retrieval, and so on. The maturity of the enterprise's processes, as well as its state of virtualization, will be important in making these decisions.

Many enterprises may want to test the cloud computing environment by moving their 'non-core' applications to the cloud. This implies that the enterprise has a very mature business continuity planning (BCP) and disaster recovery (DR) process in place, so that suitable 'non-core' applications can be identified. The same holds true for enterprise data, storage, and the like.

Application integration technologies and methods also influence the choice of applications to be moved to the cloud. A highly complex, tightly integrated application and its set of services may be a good candidate for a private cloud, whereas standalone applications may be good candidates for a public cloud. The same holds true for enterprise data: confidential, trade-related data may be hosted in a private cloud to manage security concerns, while non-critical data may be hosted in a public cloud. Cloud types and configurations can thus be matched to the risk and security profile of the enterprise, as the sketch below illustrates.
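
A hypothetical decision rule along those lines is sketched below; the attribute names and thresholds are illustrative assumptions, and a real assessment would weigh many more factors.

```python
# Hypothetical placement rule: integration complexity and data sensitivity
# steer an application toward a private or public cloud. The attribute names
# and the threshold of 5 integration points are illustrative assumptions.

def suggest_placement(app):
    if app["data_sensitivity"] == "confidential" or app["integration_points"] > 5:
        return "private cloud"
    if app["is_core"]:
        return "private cloud"  # keep core systems in-house until processes mature
    return "public cloud"

apps = [
    {"name": "ERP", "is_core": True,
     "integration_points": 12, "data_sensitivity": "confidential"},
    {"name": "Marketing site", "is_core": False,
     "integration_points": 1, "data_sensitivity": "public"},
]
for app in apps:
    print(app["name"], "->", suggest_placement(app))
```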

Analysis and Test
Once the applications or platforms have been selected, it is essential to test them under the cloud scenario. If an application will be accessed through a private cloud, a testing sandbox comprising the application, storage, intranet access, and so on needs to be created. This is a validation exercise to test performance under various load conditions. The results of this testing should align with the SLAs established with the business; otherwise, moving to the cloud can adversely affect business functions.
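
A minimal load-test sketch for this validation step is shown below: it drives concurrent requests at the candidate application's cloud endpoint and compares observed latency to the SLA agreed with the business. The URL, concurrency level, and 500 ms p95 threshold are placeholder assumptions.

```python
# Minimal load-test sketch: measure request latency under concurrent load
# and check it against an SLA threshold. Endpoint and limits are placeholders.
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URL = "https://app-under-test.example.com/health"  # hypothetical endpoint
SLA_P95_SECONDS = 0.5

def timed_request(_):
    start = time.perf_counter()
    urllib.request.urlopen(URL, timeout=10).read()
    return time.perf_counter() - start

with ThreadPoolExecutor(max_workers=50) as pool:  # simulate 50 concurrent users
    latencies = sorted(pool.map(timed_request, range(500)))

p95 = latencies[int(len(latencies) * 0.95)]
status = "within" if p95 <= SLA_P95_SECONDS else "BREACHES"
print(f"p95 latency: {p95:.3f}s ({status} SLA)")
```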

Service Provider Selection
Once the analysis of the identified candidate applications, processes, and databases has been completed, service provider selection should be based on those needs. In addition, the business's requirements on the viability of the SP, its 'always-on' capability, its ability to track and prevent machine downtime and recover from failures dynamically, and the SLAs between the enterprise and the SP all need to be considered when choosing the right SP or SPs. Depending on the criticality of the application or process and the BCP associated with it, these may be hosted by multiple SPs with failover capabilities between the sites.
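
One common way to structure such a comparison is a weighted scorecard, sketched below. The criteria, weights, providers, and scores are all illustrative assumptions.

```python
# Hypothetical weighted scorecard for comparing service providers.
# Criteria, weights, and scores (0-10) are illustrative, not real data.

WEIGHTS = {"availability": 0.4, "sla_terms": 0.3, "failover": 0.2, "viability": 0.1}

providers = {
    "SP-A": {"availability": 9, "sla_terms": 7, "failover": 8, "viability": 9},
    "SP-B": {"availability": 8, "sla_terms": 9, "failover": 6, "viability": 7},
}

def weighted_score(scores):
    return sum(WEIGHTS[criterion] * value for criterion, value in scores.items())

# Rank providers from best to worst overall fit.
for name, scores in sorted(providers.items(), key=lambda kv: -weighted_score(kv[1])):
    print(f"{name}: {weighted_score(scores):.2f}")
```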

Deployment
This is the last step in the process, where application code, storage, and so on are ported or migrated to the cloud. It can involve the creation of new services and processes to support the new environment. Migration is an incremental process: as enterprise IT gets comfortable with it, applications can be moved to the cloud in phases or batches.
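
The sketch below illustrates that phased, batch-by-batch approach. The migrate() and verify() helpers are hypothetical stand-ins for whatever tooling the enterprise and its SP actually use.

```python
# Phased migration sketch: move applications in small batches and verify each
# batch before proceeding. migrate() and verify() are hypothetical stand-ins.

def migrate(app):
    print(f"migrating {app}...")           # e.g., redeploy code, sync storage

def verify(app):
    print(f"verifying {app} against SLA")  # e.g., smoke tests, SLA checks
    return True

def phased_migration(apps, batch_size=2):
    for i in range(0, len(apps), batch_size):
        batch = apps[i:i + batch_size]
        for app in batch:
            migrate(app)
        # Halt the rollout if any application in the batch fails verification.
        if not all(verify(app) for app in batch):
            raise RuntimeError(f"batch {batch} failed verification; halting rollout")

phased_migration(["wiki", "crm", "reporting", "intranet portal"])
```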

Limitations
The maturity of an enterprise's computing is very important as companies look at moving to the cloud. A very mature BCP and DR process, as well as a highly virtualized Data Center environment, is a prerequisite before companies can make the jump. The definition of core and non-core applications and data, and the choice of the initial applications to move to the cloud, are also critical. Some legacy graphical applications that span multiple desktops at the user end are not good candidates to move to the cloud. Applications whose performance has been tuned to the point that any change in the hardware environment adversely affects performance are also poor initial candidates.

The Future
As more and more companies look at cloud computing as a viable alternative to a brick-and-mortar data center, the future of the Data Center as we know it is expected to change drastically. As service providers remove the risk concerns of moving applications and data to the cloud, by having their Cloud Data Centers audited in line with COBIT, PCI, SOX, and ISO 27002 to prove that the data in their data centers is safe and that the various environments are physically and logically separated, more and more CIOs are expected to move their Data Center environments to the cloud (up to 65%, per Gartner's report). Enterprise Data Centers, NOCs, and the like may be a dying breed in the next decade.

About the Author

Deb Chanda has 20 years of infrastructure and process consulting, thought leadership, client management, program and portfolio management, and custom solution architecture experience at large global customers in multiple domains (manufacturing, distribution/retail/CPG, health care, and oil and gas). He has extensive domain expertise in enterprise architecture and cloud computing, data center architecture, IT strategy and optimization, remote management services including global outsourcing, practice, program, and project management, and business process re-engineering.
