The Intelligence Inside: Cloud Developers Change the World of Analytics

Evidence is mounting that embedding analytics inside apps business people use every day can lead to quantifiable benefits

Slide Deck from Karl Van den Bergh's Cloud Expo Presentation: The Intelligence Inside: How Developers of Cloud Apps Will Change the World of Analytics

We live in a world that requires us to compete on our differential use of time and information, yet only a fraction of information workers today have access to the analytical capabilities they need to make better decisions. Now, with the advent of a new generation of embedded business intelligence (BI) platforms, cloud developers are disrupting the world of analytics. They are using these new BI platforms to inject more intelligence into the applications business people use every day. As a result, data-driven decision-making is finally on track to become the rule, not the exception.

The Increased Focus on Analytics
With the emphasis on data-driven decision-making, it is perhaps not a surprise that the focus on analytics continues to mount. According to IDC's Dan Vesset, 2013 was poised to be the first year that the market for data-driven decision making enabled by business analytics broke through the $100 billion mark. IT executives are also doubling down on analytics, a fact highlighted by Gartner's annual CIO survey, which has ranked analytics as the number one technology priority in three of the last five years. So, given the importance of analytics and the spending on it, everyone should have access to the insight they need, right?

Most Business People Still Don't Use Analytics
Amazingly, in spite of this growth in spending and focus, most information workers today do not have access to business intelligence. In fact, Cindi Howson of BI Scorecard has found that end-user adoption of BI seems to have stagnated at about 25%. This stagnation is difficult to reconcile. How is it possible that, at best, one quarter of information workers have access to what is arguably most critical to their success in a world that runs on data?

There are a variety of reasons for stagnant end-user adoption, including the high costs associated with BI projects and an overall lack of usability. However, the biggest impediment to BI adoption has nothing to do with the technology. The reality is that the vast majority of business decision makers do not spend their day working in a BI tool - nor do they want to. Users already have their preferred tool or application: sales representatives use a CRM service; marketers use a campaign management or marketing automation platform; back-office workers spend much of their day in an ERP application; executives typically work in their preferred productivity suite; and the list goes on. Unless you are a data analyst, you are not going to want to spend much of your day using a BI tool. But just because business people prefer not to use a BI tool does not mean they don't want access to pertinent data to bolster better decision-making.

The Need for More Intelligence Inside Applications
What's the solution? Simply put, bring the data TO users inside their preferred applications instead of expecting them to go to a separate BI system to find the report, dashboard or visualization that's relevant to the question at hand. If we want to reach the other 75% of business people who don't have access to a standalone BI product, we have to inject intelligence inside the applications and services they use every day. It is only through more intelligent applications that organizations can benefit from broader data-driven decision-making. In fact, according to Gartner, BI will only become pervasive when it essentially becomes "invisible" to business people as part of the applications they use daily. In a 2013 report highlighting key emerging tech trends, Gartner concludes that in order "to make analytics more actionable and pervasively deployed, BI and analytics professionals must make analytics more invisible and transparent to their users." How? The report explains this will happen "through embedded analytic applications at the point of decision or action."

If the solution to pervasive BI is to deliver greater intelligence inside applications, why don't more applications embed analytics? The reality is that only a small fraction of applications built today have embedded intelligence. Sure, they might have a table or a chart, but there is no intelligent engine behind it; users typically can't personalize a report or dashboard, or self-serve to generate new visualizations on an ad-hoc basis. The culprit here is that business intelligence was originally conceived as a standalone activity, not something designed to be embedded. Specifically, the reasons driving developers to ignore BI platforms boil down to cost and complexity.

Cost and Complexity Are Barriers to Embedded BI
Traditionally, BI tools have carried a user-based licensing model, with licenses typically costing from tens of thousands to millions of dollars. Such high per-user costs might be justified for a relatively small, predictably sized population that includes a large percentage of power users who will spend a good amount of time working with the BI tool. This user-based model, however, is totally unsuitable for the embedded use case, which is geared toward business users who will access the BI features less frequently and likely have less analytics experience than the traditional power user - in this scenario, high per-user costs simply can't be justified.

BI products are also complex on a number of different levels. First, they are complex to deploy, often requiring months if not years to roll out to any reasonable number of users. Second, they are complex to use, both for the developers building the reports and dashboards and for the business people interacting with the tool. Third, they are complex to embed: designed as standalone products, BI tools are not architected to plug into another application.

Given the cost and complexity of traditional standalone BI offerings, it is no surprise that developers often turn to charting libraries to deliver the visualizations within their application. The cost is low and the libraries are relatively simple for a developer to embed. In the short term, a charting library is a reasonable solution, but over time it falls flat. The demands for more charts, dashboards and reports quickly grow, and end users begin looking for the ability to self-serve and create their own visualizations. As a result of these mounting demands, many application developers find themselves essentially building a BI tool, taking them outside their core competency and stealing precious time away from advancing their own application.
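To illustrate the starting point, the snippet below is a minimal charting-library sketch (written against a Chart.js-style API; the canvas ID and the data are hypothetical). It produces one hand-built chart, which is exactly the limitation described above: every additional chart, filter or drill-down means more custom code written by the application developer.

    // Minimal charting-library sketch (Chart.js-style API); the canvas ID and data are illustrative.
    // Each chart is hand-built by the developer - there is no engine behind it for ad-hoc exploration.
    const ctx = document.getElementById('revenueChart'); // assumes <canvas id="revenueChart"> in the page
    new Chart(ctx, {
      type: 'bar',
      data: {
        labels: ['Q1', 'Q2', 'Q3', 'Q4'],
        datasets: [{ label: 'Revenue ($K)', data: [420, 515, 480, 610] }]
      },
      options: { responsive: true }
    });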

Could a New Generation of Embedded BI Provide the Solution?
Fortunately, there is a new generation of embedded analytic platforms emerging that looks set to address these challenges of cost and complexity. Wayne Eckerson, a noted BI analyst, identifies this as the third generation of embedded analytics in his article on the Evolution of Embedded BI. In summary, Eckerson describes the third generation as "moving beyond the Web to the Cloud," where developers can "rent these Cloud-based BI tools by the hour." These BI platforms can "support a full range of BI functionality including data exploration and authoring" and can be embedded through standard interfaces like REST and JavaScript. So, how does this third generation address the issues of cost and complexity?
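To make the REST part of that concrete, here is a minimal sketch of how an application back end might run a report through a cloud BI service over HTTP. The host name, endpoint path, parameters and response shape are assumptions for illustration, not any particular vendor's API.

    // Hypothetical REST call from an application back end to a cloud BI service.
    // The host, path and JSON shapes are illustrative assumptions, not a real vendor API.
    async function fetchSalesReport(region) {
      const response = await fetch('https://bi.example.com/api/v1/reports/sales-summary', {
        method: 'POST',
        headers: {
          'Authorization': `Bearer ${process.env.BI_API_TOKEN}`, // token issued by the BI service
          'Content-Type': 'application/json'
        },
        body: JSON.stringify({ parameters: { region }, format: 'json' })
      });
      if (!response.ok) throw new Error(`Report request failed: ${response.status}`);
      return response.json(); // report rows the host application can render or post-process
    }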

Utility Pricing Dramatically Reduces Cost
To address the challenge of cost, a new generation of embedded analytics platforms employs a utility-based licensing model, where the software is available on a per-core, per-hour or per-gigabyte basis. From a developer's perspective, this is a much fairer model, as one only pays for what is used. At the beginning of the application lifecycle, when usage is sporadic, developers can limit their costs; as the application becomes successful and usage grows, capacity can be scaled up easily. A recent report by Nucleus Research concluded that utility pricing for analytics can save organizations up to 70% of what they would pay for a traditional BI solution. I've written previously about how utility pricing will dramatically increase the availability of analytics, reaching a much broader set of organizations. The rapid adoption of Amazon's Redshift data warehousing service and Jaspersoft's reporting and analytics service on the AWS Marketplace provides rich testimony to the benefits of this model.
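As a rough back-of-the-envelope illustration of why the savings can be so large (every figure below is a hypothetical assumption, not a quoted price from any vendor), compare a per-user license for a modest user population with an hourly-billed instance that runs only during business hours:

    // Hypothetical cost comparison: per-user licensing vs. utility (per-hour) pricing.
    // Every figure here is an illustrative assumption, not a vendor price.
    const users = 500;
    const perUserLicense = 300;                     // assumed $/user/year for a traditional BI license
    const traditionalCost = users * perUserLicense; // $150,000 per year

    const hourlyRate = 5;                           // assumed $/hour for a hosted BI instance
    const hoursPerYear = 8 * 22 * 12;               // business hours only: 2,112 hours per year
    const utilityCost = hourlyRate * hoursPerYear;  // $10,560 per year

    console.log({ traditionalCost, utilityCost });  // the utility model costs a fraction of per-user licensing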

Cloud and Web-Standard APIs Reduce Complexity
A cloud-based BI platform significantly simplifies deployment, as there is no BI server to install or configure. The Nucleus Research report found that utility-priced cloud BI solutions could be deployed in weeks or even days, as opposed to the months commonly required for a traditional BI product.

Leveraging web-standard APIs like REST and JavaScript, the third-generation platforms also simplify the task of embedding analytics on both the front end and the back end of the application. Importantly, these APIs allow full-featured, self-service BI capabilities to be embedded, not just reports and dashboards. This increases the application's ability to respond to the ad-hoc information requests of business users.
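On the front end, embedding typically amounts to loading the platform's JavaScript client and pointing it at a container element in the host application. The client name, options and control flags below are hypothetical, sketched to show the shape of such an integration rather than any specific product's API.

    // Hypothetical front-end embedding sketch: 'EmbeddedBI', its options and the resource name
    // are illustrative assumptions, not a specific vendor's JavaScript API.
    const bi = new EmbeddedBI({
      server: 'https://bi.example.com',
      token: sessionToken                           // short-lived token obtained by the host application
    });

    bi.render({
      resource: 'dashboards/pipeline-overview',     // dashboard defined once in the BI platform
      container: '#analytics-panel',                // element inside the host application's own page
      params: { region: currentUser.region },       // personalize the view for the signed-in user
      controls: ['filters', 'export', 'adhoc']      // expose self-service features, not just a static view
    });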

The Benefits of Embedded Intelligence
Intuitively, it would seem that, by providing analytics within the applications business people use every day, an organization should experience the benefits of more data-driven decision-making. But is there any proof?

A recent report by the Aberdeen Group, based on data from over 130 organizations, has helped shed light on some of the benefits of embedded analytics. First, as might be expected, those companies using embedded analytics saw 76% of users actively engaged in analytics versus only 11% for those with the lowest embedded BI adoption. As a result, 89% of the business people in these best-in-class companies were satisfied with their access to data versus only 21% in the industry laggards. The bottom line? Companies leading embedded BI adoption saw an average 19% increase in operating profit versus only 9% for the other companies.

Andre Gayle, who helps manage a voicemail service at British Telecom, illustrates the difference embedded analytics can make. "We had reports [before] but they had to be emailed to users, who had to wait for them, then dig through them as needed. It was inefficient and wasteful." Now, thanks to embedded analytics, British Telecom has seen huge savings in time and cost. As Gayle explains, capacity planning for the voicemail service used to be a "laborious exercise, involving several days of effort to dig up the numbers" but can now be done "on demand, in a fact-based manner, in just a few minutes."

The evidence is mounting that embedding analytics inside the applications business people use every day can lead to quantifiable benefits. However, the protagonist here, unlike in the traditional world of analytics, must be the developer, not the analyst. A new generation of embedded BI platforms is making it easier and more cost-effective for developers to deliver the analytical capabilities needed inside the cloud applications they are building. As developers increasingly avail themselves of these new platforms, we can hope that BI will finally become pervasive as an information service that informs day-to-day operations. As Wayne Eckerson puts it, "In many ways, embedded BI represents the fulfillment of BI's promise." Now it's up to cloud developers to help us realize that promise.

More Stories By Karl Van den Bergh

Karl Van den Bergh is the Vice President of Product Strategy at Jaspersoft, where he is responsible for product strategy, product management and product marketing. Karl is a seasoned high-tech executive with 18 years of experience in software, hardware, open source and SaaS businesses, both startup and established.

Prior to Jaspersoft, Karl was the Vice President of Marketing and Alliances at Kickfire, a venture-funded data warehouse appliance startup. He also spent seven years at Business Objects (now part of SAP), where he held progressively senior leadership positions in product marketing, product management, corporate development and strategy – ultimately becoming the General Manager of the Information-On-Demand business. Earlier in his career, he was responsible for EMEA marketing at ASG, one of the world’s largest privately-held software companies. Karl started his career as a software engineer.
