Cloud Computing: Creating a Generic (Internal) Cloud Architecture

Do Cloud-like architectures have to remain external to the enterprise? No.

Kenneth Oestreich's Blog

I've been taken aback lately by the tacit assumption that cloud-like (IaaS and PaaS) services have to be provided by folks like Amazon, Terremark and others. It's as if these providers do some black magic that enterprises can't touch or replicate.

However, history has taught the IT industry that what starts in the external domain eventually makes its way into the enterprise, and vice-versa. Consider Google, which began with internet search and later offered an enterprise search appliance. Then there's the reverse: an application, say a CRM system, leaves the enterprise to be hosted externally as SaaS, such as Salesforce.com. But even in this case, the first example then recurs -- as Salesforce.com begins providing internal Salesforce.com appliances back to its large enterprise customers!

I am simply trying to challenge the belief that cloud-like architectures have to remain external to the enterprise. They don't. I believe it's inevitable that they will soon find their way into the enterprise, and become a revolutionary paradigm of how *internal* IT infrastructure is operated and managed.

With each IT management conversation I have, the concept I recently put forward becomes clearer and more inevitable: that an "internal cloud" (call it a cloud architecture or utility computing) will penetrate enterprise datacenters.

Limitations of "external" cloud computing architectures

Already, a number of authorities have clearly outlined the pros and cons of using external service providers as "cloud" providers. For reference, there is the excellent "10 reasons enterprises aren't ready to trust the cloud" by Stacey Higginbotham of GigaOM, as well as a piece by Mike Walker of MSDN on the "Challenges of moving to the cloud". So it stands to reason that innovation will work around these limitations, borrowing the positive aspects of external service providers, omitting the negatives, and offering the result to IT Ops.

Is an "internal" cloud architecture possible and repeatable?

So here is my main thesis: that there are software IT management products available today (and more to come) that will operate *existing* infrastructure in a manner identical to the operation of IaaS and PaaS. Let me say that again -- you don't have to outsource to an "external" cloud provider as long as you already own legacy infrastructure that can be re-purposed for this new architecture.

This statement -- and the enabling software technologies behind it -- spells the beginning of the final commoditization of compute hardware. (BTW, I find it amazing that some vendors continue to tout that their hardware is optimized for cloud computing. That is a real oxymoron.)

As time passes, cloud-computing infrastructures (OK, Utility Computing architectures if you must), coupled with the trend toward architecture standardization, will continue to push the importance of specialized HW out of the picture. Hardware margins will continue to be squeezed. (BTW, you can read about the "cheap revolution" in Forbes, featuring our CEO Bill Coleman.)

As the VINF blog also observed, regarding cloud-based architectures:

You can build your own cloud, and be choosy about what you give to others. Building your own cloud makes a lot of sense. It's not always cheap, but it's the kind of thing you can scale up (or down) with a bit of up-front investment. In this article I'll look at some of the practical, more infrastructure-focused ways in which you can do so.

Your "cloud platform" is essentially an internal shared-services system where you can actually and practically implement a "platform" team that operates and capacity-plans for the cloud platform; they manage its availability, day-to-day maintenance, and expansion/contraction.

Even back in February, Mike Nygard observed reasons and benefits for this trend:
Why should a company build its own cloud, instead of going to one of the providers?

On the positive side, an IT manager running a cloud can finally do real chargebacks to the business units that drive demand. Some do today, but on a larger-grained level... whole servers. With a private cloud, the IT manager could charge by the compute-hour, or by the megabit of bandwidth. He could charge for storage by the gigabyte, and with tiered rates for different availability/continuity guarantees. Even better, he could allow the business units to do the kind of self-service that I can do today with a credit card and The Planet. (OK, The Planet isn't a cloud provider, but I bet they're thinking about it. Plus, I like them.)
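Nygard's metered-chargeback idea is easy to make concrete. The sketch below is purely illustrative -- the rate card, the Usage fields, and the tiered storage rates are hypothetical numbers I've invented for the example, not any provider's actual pricing:

```python
# Hypothetical per-unit chargeback calculator. All rates and field
# names are illustrative assumptions, not a real billing system.
from dataclasses import dataclass

@dataclass
class Usage:
    compute_hours: float
    bandwidth_gb: float
    storage_gb: float
    storage_tier: str  # "standard" or "high-availability"

# Illustrative rate card (dollars per unit); storage is tiered by
# the availability/continuity guarantee, as Nygard suggests.
RATES = {
    "compute_hour": 0.10,
    "bandwidth_gb": 0.05,
    "storage_gb": {"standard": 0.02, "high-availability": 0.08},
}

def monthly_chargeback(usage: Usage) -> float:
    """Compute a business unit's monthly bill from metered usage."""
    cost = usage.compute_hours * RATES["compute_hour"]
    cost += usage.bandwidth_gb * RATES["bandwidth_gb"]
    cost += usage.storage_gb * RATES["storage_gb"][usage.storage_tier]
    return round(cost, 2)

# One server-month of compute, 100 GB transfer, 500 GB premium storage:
bill = monthly_chargeback(Usage(720, 100, 500, "high-availability"))
print(bill)  # 72.0 + 5.0 + 40.0 = 117.0
```

The point isn't the arithmetic; it's that fine-grained metering turns IT from an opaque cost center into something the business units can reason about.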

We are seeing the beginning of an inflection point in the way IT is managed, brought on by (1) the interest in (though not yet adoption of) cloud architectures, (2) the increasing willingness to accept shared IT assets (thanks to VMware and others), and (3) the budding availability of software that allows "cloud-like" operation of existing infrastructure, but in a whole new way.

How might these "internal clouds" first be used?

Let's be real: there are precious few green-field opportunities where enterprises will simply decide to change their entire IT architecture and operations into this "internal cloud" -- i.e. implement a Utility Computing model out-of-the-gate. But there are some interesting starting points that are beginning to emerge:

  • Creating a single-service utility: by this I mean that an entire service tier (such as a web farm, application server farm, etc.) moves to being managed in a "cloud" infrastructure, where resources ebb and flow as needed by user demand.
  • Power-managing servers: using utility computing IT management automation to control power states of machines that are temporarily idle, but NOT actually dynamically provisioning software onto servers. Firms are getting used to the idea of using policy-governed control to save on IT power consumption as they get comfortable with utility-computing principles. They can then selectively activate the dynamic provisioning features as they see fit.
  • Using utility computing management/automation to govern virtualized environments: it's clear that once firms virtualize/consolidate, they later realize that there are more objects to manage (virtual sprawl), rather than fewer; plus, they've created "virtual silos", distinct from the non-virtualized infrastructure they own. Firms will migrate toward an automated management approach to virtualization where -- on the fly -- applications are virtualized, hosts are created, apps are deployed/scaled, failed hosts are automatically re-created, and so on. Essentially, a services cloud.
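The first two starting points -- demand-driven resource pools and policy-governed power management -- boil down to a control loop. The sketch below is a simplified illustration under invented assumptions (the Server model, the idle threshold, and the capacity floor are all hypothetical), not how any particular management product works:

```python
# Sketch of a policy-governed controller for an internal-cloud pool:
# wake servers when demand rises, park servers idle past a threshold.
# Thresholds, the Server model, and action names are hypothetical.
from dataclasses import dataclass

@dataclass
class Server:
    name: str
    powered_on: bool
    idle_minutes: int  # time since last workload

IDLE_POWER_OFF_AFTER = 30  # minutes idle before powering off
MIN_ACTIVE_SERVERS = 2     # capacity floor the policy must preserve

def apply_power_policy(pool: list[Server], demand: int) -> list[str]:
    """Return the actions the policy takes; demand is the number of
    servers the current workload requires."""
    actions = []
    active = [s for s in pool if s.powered_on]
    # Scale up first: wake sleeping servers if demand exceeds capacity.
    for s in pool:
        if len(active) >= demand:
            break
        if not s.powered_on:
            s.powered_on = True
            active.append(s)
            actions.append(f"power-on {s.name}")
    # Then scale down: park the longest-idle servers, but never drop
    # below the capacity floor or the current demand.
    floor = max(MIN_ACTIVE_SERVERS, demand)
    for s in sorted(active, key=lambda s: -s.idle_minutes):
        if len(active) <= floor:
            break
        if s.idle_minutes >= IDLE_POWER_OFF_AFTER:
            s.powered_on = False
            active.remove(s)
            actions.append(f"power-off {s.name}")
    return actions

pool = [Server("web-1", True, 45), Server("web-2", True, 40),
        Server("web-3", True, 0)]
print(apply_power_policy(pool, 1))  # ['power-off web-1']
```

Note that the power-management bullet is exactly this loop with the scale-up half disabled -- which is why firms can adopt it first and switch on dynamic provisioning later.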

It is inevitable that the simplicity, economics, and scalability of externally-provided "clouds" will make their way into the enterprise. The question isn't if, but when.

More Stories By Kenneth Oestreich

Ken Oestreich is VP of Product Marketing with Egenera, and has spent over 20 years developing and introducing new products into new markets. Recently, he’s been involved in bringing utility- and cloud-computing technologies to market. Previously, Ken was with Cassatt, and held a number of developer, alliance and strategy positions with Sun Microsystems.

Most Recent Comments
sajai krishnan 08/26/08 09:53:44 PM EDT

Ken
Very much on topic. In our parallel area around cloud storage, we see interest in internal/private storage clouds as much as in external/public storage clouds. Bandwidth and security are clearly reasons to go with a private cloud, whereas getting offsite copies is certainly one reason to consider a public cloud. There is the additional reason that by building your own storage cloud you can tune its performance characteristics by having, for example, beefy, high-performing nodes for streaming or inexpensive nodes with a lot of disks for archival applications.

As for service providers -- I think we will see service providers delivering the typical public service like S3, but they could also provide "insourcing" services ... i.e., a service provider managing a dedicated internal cloud for a Fortune 100 data center in a colo model. I think AT&T's recent Synaptic Hosting is probably headed in that direction.

There are a few different ways to skin this cat in terms of implementation. The key is that the technology matures, and customers get familiar with the commodity scale-out economics and easy management model that are at the core of this approach.

Regards,
Sajai Krishnan, CEO ParaScale

amuletc 08/25/08 08:14:58 PM EDT

By Dan D. Gutierrez
CEO of HostedDatabase.com

I really like your concept of an "internal cloud"! When my firm launched the web's first Database-as-a-Service offering in 1999, we had a sales option to create a special instance of our product for an enterprise that wanted the convenience of SaaS but was concerned about privacy and security issues. Bringing in our service as an internal cloud solved these issues. Fast forward nearly 10 years, and it is great to see this concept mentioned in this timely article.
