By Rob Fox
April 19, 2013 02:00 PM EDT
In recent years, IT departments have been confronted with the convergence of several highly disruptive trends that have fundamentally altered the enterprise IT landscape, particularly when it comes to how data and applications are managed. Mobility and the rise of BYOD (bring your own device), as well as the growth of social media and the electronic information it generates, have each proved transformative. But perhaps no shift has been more seismic than the adoption of cloud and SaaS-based applications led by CIOs who see the value proposition associated with outsourcing many complex IT operations.
However, integrating data across diverse SaaS applications with existing on-premise solutions has proven exceptionally challenging. To streamline this integration without slowing adoption, IT stakeholders are turning to cloud-based integration solutions that can curtail complexity and IT oversight while enabling organizations to better leverage their information capital to drive business objectives. Indeed, according to a recent report by analyst firm MarketsandMarkets, the global Cloud Services Brokerage (CSB) market is on track to grow from $1.57 billion in 2013 to $10.5 billion by 2018, a compound annual growth rate of more than 45% over the five-year period.
In this article, we will provide advice to IT leaders for creating sustainable environments using hybrid integration between SaaS technologies and existing on-premise applications. We will also explore the top considerations for building out a successful cloud integration strategy that offers the scalability and flexibility to withstand fluctuations in enterprise data management needs.
Start by Asking the Right Questions
Over the past few years, "Cloud" has transformed from the buzzword of the moment - all the rage but lacking concrete definition - to an efficient, widely recognized enabler of scalable IT operations. Despite the increasing ubiquity and viability of the cloud delivery model, it's important to remember that cloud is not "IT in a box." No one cloud service provider can meet all the complex IT needs of a single organization. By and large, enterprises evaluate and onboard an array of purpose-built solutions from diverse cloud providers. As a result, the need to successfully integrate them not only with each other, but also with traditional on-premise application-to-application (A2A) and business-to-business (B2B) systems is critical. The multitude of complex integrations - A2A, B2B, and on-premise applications to SaaS/cloud applications, and cloud-to-cloud (C2C) - requires a clear-cut integration strategy.
A critical first step in developing an integration strategy is to ask and answer a few key questions, the first of which is "what problem is the integration solving?" While achieving streamlined integration between cloud-based systems like Magento, NetSuite, SAP, Ariba, and salesforce.com is one aspect of a full-fledged strategy, it's important to remember the challenge extends beyond cloud-to-cloud integration. In reality, what many people today refer to as "cloud integration" is actually hybrid integration - integration not only between cloud systems, but between cloud and on-premise applications. Determining the specific integration goal - whether it is strictly cloud-to-cloud, or a larger hybrid model - ensures the strategy scales to both immediate and long-term integration needs.
Once you consider what problem the integration will solve, it's important to consider how integration will solve the problem. As the number of systems to be integrated grows, the number of potential interface points grows quadratically - with n systems, up to n(n-1)/2 point-to-point connections - and traditional, manually driven point-to-point integration can quickly become overwhelming. Each time an individual application is altered, or a trading partner changes its specification interface, IT must review all external connections for potential impact. An upgrade cycle for a large ERP system may spawn dozens, hundreds, or even thousands of integration projects across several departments and external trading partners.
Continuing to rely on this point-to-point integration model will become untenable as cloud adds another layer of complexity to the integration landscape. In order to avert chaos, enterprises are actively leveraging integration to create an interconnected web that holistically addresses data management and integration challenges across all of these disparate systems and applications. If an integration strategy is designed with a broader goal in mind, it is much more likely that the same strategy can be leveraged not only to solve immediate integration challenges, but future demands as well.
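The arithmetic behind this growth is easy to sketch. A minimal illustration (the function names are ours, not drawn from any specific product) comparing the point-to-point model against a brokered hub-and-spoke model:

```python
# Point-to-point integration requires a link between every pair of
# systems: n * (n - 1) / 2 connections. A hub-and-spoke (broker)
# model needs only one connection per system.
def point_to_point_links(n: int) -> int:
    """Number of pairwise links needed to connect n systems directly."""
    return n * (n - 1) // 2

def hub_links(n: int) -> int:
    """Number of links needed when every system connects to one hub."""
    return n

for n in (5, 10, 25, 50):
    print(f"{n:3d} systems: {point_to_point_links(n):5d} point-to-point links "
          f"vs {hub_links(n):3d} hub connections")
```

At ten systems the difference is 45 links versus 10; at fifty it is 1,225 versus 50 - which is why the point-to-point model stops scaling long before the application portfolio does.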
Identifying where integration is needed and how it can benefit an organization is an important first step. But once the decision has been made to move forward, there are a few key considerations that CIOs must take into account to successfully build out a strategy with staying power.
Reading the Signs: Spotting and Addressing Complexity
Anticipating the areas in which integration complexity is most likely to arise is crucial to the development of a flexible, cost-effective integration strategy. The following are two of the usual suspects of which CIOs should be aware:
- SaaS APIs: Many cloud providers promise to deliver a simple-to-use web API, but this is rarely the reality. Specifications for many SaaS APIs can run into the dozens, if not hundreds, of pages long, and can be a major headache for internal teams unfamiliar with the nuances of integration. Moreover, APIs often evolve over time as SaaS applications evolve, generating a source of ongoing complexity.
- Data Translation: The potential for complexity, however, does not end once the APIs are successfully integrated. Translating data between different SaaS applications, as well as between SaaS and on-premise systems, can be challenging, and this translation should be factored into the complexity calculus. Data that is not properly translated will be rendered useless, and backtracking to fix the glitch can add time and expense to business-critical projects. As a general rule, a bug that costs one dollar to fix during development will cost 10 dollars to fix during quality assurance, and 100 dollars to fix once in production. This backtracking approach can prove particularly brittle when new systems are added to the ecosystem.
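A minimal sketch of the translation step, with entirely hypothetical field names (real SaaS APIs define their own payload formats), shows why validating the mapping early pays for itself:

```python
# Hypothetical mapping between a source SaaS schema and a target
# schema. The field names below are invented for illustration only.
FIELD_MAP = {
    "cust_name": "CustomerName",
    "cust_email": "Email",
    "order_total": "Amount",
}

def translate(record: dict) -> dict:
    """Map a source record onto the target schema, flagging gaps up front."""
    missing = [k for k in FIELD_MAP if k not in record]
    if missing:
        # Catching a translation gap here is the one-dollar fix;
        # discovering it in production is the hundred-dollar fix.
        raise ValueError(f"source record missing fields: {missing}")
    return {target: record[source] for source, target in FIELD_MAP.items()}

print(translate({"cust_name": "Acme Corp",
                 "cust_email": "ap@acme.example",
                 "order_total": 1250.00}))
```

The design point is not the mapping itself but where the check lives: failing fast on an incomplete record during development is what keeps the backtracking cost at the low end of the 1-10-100 rule.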
A Long-Term Vision: Thinking Beyond the First Integration Project
Integration with cloud is often a daunting prospect, particularly for businesses just beginning to onboard cloud applications as part of their IT strategy. The immensity of a single cloud integration can produce tunnel vision for IT teams, who get so bogged down in an initial project that they fail to consider the long-term implications of the integration and how it will ultimately fit into the overarching IT architecture - a problem already amply demonstrated with the pitfalls of the point-to-point approach. However, the inevitable complexity of integrating multiple applications over time should be sufficient incentive to give any CIO pause before creating a strategy tailor-made for a single integration project.
Even though it will likely require greater upfront investment and effort, organizations must settle on a cohesive sourcing strategy for integration that meets their individual needs. There are three fundamental options for this strategy: a do-it-yourself (DIY) approach based solely on existing knowledge of on-premise software; a DIY approach using a customer-driven integration Platform-as-a-Service (iPaaS); or outsourcing integration entirely to a third-party integration brokerage provider. When determining which of these strategies to adopt, it is important to consider the following:
- First, consider the deployment timeline. As departments across the enterprise demand rapid access to new and greater functionality offered by diversifying SaaS applications, IT departments are under mounting pressure to test, procure and deploy these solutions. This is where a CSB can help speed things up based on their experience working with various customers, implementation scenarios and technologies. Even as deployment windows tighten, however, many businesses are only just beginning to build out core competency around integration. For those with the strictest timelines, the option to build out an internal integration function may have already passed, and it may become necessary to bring in a third-party integration provider. While some may initially view these external integration providers as a Band-Aid solution, working with a specialized integration broker can often be the best long-term solution, especially when it comes to cloud integration where existing IT teams may have less familiarity.
- Second, consider the cost for integration in the long term. As the complexity of cloud integration projects continues to increase, building out an internal team will require a capital investment in expert personnel and software. Although it requires greater initial investment, this relatively fixed capital expenditure may be a better use of resources for some organizations. For others, such a large capital expenditure may not be feasible or efficient. Outsourcing projects to an integration broker shifts the cost of integration as an operating expense, reducing or eliminating the up-front cost, and providing a more scalable, recurring cost-structure.
- Once these factors have been weighed, the next decision is: in-house or external? Although SaaS applications for both back-office systems and B2B processes can offer tremendous efficiencies, the coordination and integration required on the back end is no simple matter. While building out in-house integration capabilities is important for some organizations due to commercial or other business considerations, companies that choose this route must recognize it early and take a proactive approach to cultivating the expert staff and resources that will be required to effectively manage and complete integration projects. For those businesses that don't have compelling reasons to keep the integration function in-house, outsourcing may prove more efficient. Cloud Services Brokers (CSBs) have existing integration infrastructure that can be leveraged for rapid deployment, and can increase capacity on demand, offering scalability when and where it's needed most. CSBs also deliver experience and collective intelligence around integration that can offer efficiencies beyond what can be accomplished with internal resources alone.
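The cost trade-off in the second consideration above reduces to simple break-even arithmetic: a fixed capital investment plus running costs for the in-house route, versus a recurring fee for the broker route. A sketch with entirely hypothetical figures:

```python
# Hypothetical break-even comparison between building an in-house
# integration team (capex plus annual run cost) and outsourcing to a
# broker (recurring annual fee). All dollar figures are invented.
def cumulative_inhouse(years: int, capex: float, annual_run: float) -> float:
    """Total in-house cost after a given number of years."""
    return capex + annual_run * years

def cumulative_broker(years: int, annual_fee: float) -> float:
    """Total brokered cost after a given number of years."""
    return annual_fee * years

CAPEX, ANNUAL_RUN, ANNUAL_FEE = 500_000, 100_000, 220_000
for year in range(1, 7):
    inhouse = cumulative_inhouse(year, CAPEX, ANNUAL_RUN)
    broker = cumulative_broker(year, ANNUAL_FEE)
    cheaper = "in-house" if inhouse < broker else "broker"
    print(f"year {year}: in-house ${inhouse:,.0f} vs broker ${broker:,.0f} -> {cheaper}")
```

Under these assumed numbers the broker is cheaper for roughly the first four years and the in-house team wins thereafter; the real decision depends on each organization's actual figures, but the crossover structure is the same.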
The key criteria and requirements around data management continue to expand, and cloud integration is at the nexus of this expansion. By planning and executing a comprehensive integration strategy that can efficiently and consistently scale to the evolving integration requirements of the business - including traditional on-premise, back-office systems and cloud-based applications - IT can help ensure long-term scalability and business success. Whether the decision is to bring integration capabilities in-house, outsource integration needs, or use some combination of both, the time to start developing a plan is now.