
Leveraging Your Private PaaS for Feature Delivery

Improving the product and delivering value to users

The growth of cloud services for business has been a hot topic for years now, but 2012 was the year when the cloud went from market hype to mainstream deployment. Most organizations have now adopted a private cloud of some kind, but caution is preventing them from taking full advantage. Exploring the potential benefits of new tools is vital if IT departments hope to see real performance gains.

Recent Gartner research highlights the importance of digital technologies for CIOs in the coming year. Gartner's Mark McDonald described the problem succinctly, "IT needs new tools if it hopes to hunt for technology-intensive innovation and harvest raised business performance from transformed IT infrastructure, operations and applications. Without change, CIOs and IT consign themselves to tending a garden of legacy assets and responsibilities."

What Is the Problem?
As it stands, we are seeing widespread adoption of private clouds that essentially act as virtualized infrastructure. That's undeniably useful, but it doesn't solve the underlying business problem: how to accelerate the delivery of features. Features come from the application. If we really want to leverage the potential power of the private cloud, we should be working toward a setup that supports fast, cost-efficient and error-free changes to the application layer.

What we're talking about here is adding value by rolling out features internally and externally at a much quicker clip without jeopardizing the end quality. We need to go beyond infrastructure to private PaaS.

Why PaaS May Fall Short
There are quite a few PaaS solutions on the market, but many of them are not suitable for today's enterprise, for a number of reasons. The majority run fully or mainly in the public cloud, which immediately raises security concerns. They tend to support only a very limited subset of middleware and database solutions, so integration is difficult. Interoperability has not been given enough weight, and that can lead to serious difficulties down the line. And there is little mobility today for an application deployed via a typical PaaS service; you may find your business locked into a single cloud platform provider.

A start-up might see the value in adopting one of these PaaS solutions because it avoids capital expenditure at the outset, but what if your business already has a large datacenter? Many enterprises will want the option to use their existing setup, and they'll naturally shy away from becoming reliant on a particular vendor environment.

What Is the Goal?
What we are really looking for is the ability to deliver the benefits of PaaS on top of your existing middleware environment. You need a solution that supports automated, efficient, error-free application updates; auto-scaling of your runtime environment; and end-to-end insight into your running applications and their configuration.

The aim is to free your business and your development team from the cost and complexity of managing the underlying hardware and software systems that allow you to deploy your applications. When a new feature request comes in or customer feedback leads development in a new direction, a private PaaS should enable you to deliver faster than ever before and with fewer errors. An automation interface that is accessible to the whole team is far more efficient, not to mention more cost-effective.
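As a rough illustration only (the names and fields below are hypothetical, not taken from any particular product), such an automation interface might accept a declarative deployment request that bundles the application version, its middleware target and an auto-scaling policy, so every rollout is repeatable and auditable:

```python
# A minimal sketch, assuming a hypothetical private PaaS automation interface.
# It is not any vendor's real API; it only shows the shape of a declarative
# deployment request with an auto-scaling policy attached.
from dataclasses import dataclass, field

@dataclass
class ScalingPolicy:
    min_instances: int = 2          # baseline for availability
    max_instances: int = 10         # cap on runaway cost
    target_cpu_percent: int = 70    # scale out above this utilization

@dataclass
class DeploymentRequest:
    application: str                # logical application name
    version: str                    # the build being promoted
    middleware: str                 # existing runtime target (hypothetical label)
    config: dict = field(default_factory=dict)
    scaling: ScalingPolicy = field(default_factory=ScalingPolicy)

def deploy(request: DeploymentRequest) -> None:
    """Stand-in for the platform call; a real private PaaS would validate the
    request, provision or reuse middleware, push the new version and record
    the resulting configuration for end-to-end visibility."""
    print(f"Deploying {request.application} {request.version} "
          f"to {request.middleware} with scaling {request.scaling}")

if __name__ == "__main__":
    deploy(DeploymentRequest(
        application="order-service",
        version="1.4.2",
        middleware="tomcat-7",
        config={"db_url": "jdbc:postgresql://db.internal/orders"},
    ))
```

The point of the declarative shape is that the same request can be replayed, diffed and audited, which is what makes updates repeatable rather than hand-crafted.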

Working Towards Automation
Once you have a private PaaS, it's important to ensure that new features are rolled out onto the platform automatically if you want to get the maximum benefit from the system. Integrating your development tooling and test suites enables your company to deliver frequent, automated updates of incremental feature improvements.
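A simple sketch of that pipeline hook might look like the following; it assumes pytest as the test runner and uses a placeholder for the actual deployment step, which in practice would be your platform's API or CLI:

```python
# A minimal sketch of the "build, test, then roll out automatically" flow
# described above. The test command and the deploy step are placeholders,
# not a specific CI product's configuration.
import subprocess
import sys

def run_tests() -> bool:
    """Run the test suite; a non-zero exit code means the build is not fit to ship."""
    result = subprocess.run(["python", "-m", "pytest", "-q"])  # placeholder test runner
    return result.returncode == 0

def promote_to_paas(version: str) -> None:
    """Placeholder for the call that pushes the verified build to the private PaaS."""
    print(f"Promoting version {version} to the platform")

if __name__ == "__main__":
    version = sys.argv[1] if len(sys.argv) > 1 else "dev-build"
    if run_tests():
        promote_to_paas(version)   # only builds that pass the suite reach the platform
    else:
        sys.exit("Tests failed; rollout aborted")
```

Gating the promotion step on the test suite is what keeps "frequent" from turning into "frequently broken."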

The benefits are obvious both internally and externally. Not only can you automatically scale and manage your existing functionality as required, but you can also add new functionality without fear of introducing errors. It's easy to get a clear overview of your complete application state every step of the way.

If you can roll out new features and updates in this way, your development team can remain focused on what's important: improving the product and delivering value to its users.

More Stories By Andrew Phillips

Andrew Phillips heads up product management at XebiaLabs. He is an evangelist and thought leader in the DevOps, Cloud and Continuous Delivery space. He sits on the management team and drives product direction, positioning and planning.

