How Data Virtualization Improves Business Agility – Part 3

Optimize staff, infrastructure and integration approach for maximum ROI

While the benefits derived from greater business agility are significant, costs are also an important factor to consider. This is especially true in today's extremely competitive business environment and difficult economic times.

This article, the last in a series of three on how data virtualization delivers business agility, focuses on resource agility. Parts 1 and 2 addressed business decision agility and time-to-solution agility, respectively.

Resource Agility Is a Key Enabler of Business Agility
In the recently published Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility, resource agility was identified as the third key element in an enterprise's business agility strategy, along with business decision agility and time-to-solution agility.

Data virtualization directly enables greater resource agility through superior developer productivity, lower infrastructure costs and better optimization of data integration solutions.

These factors combine to provide significant cost savings that can be applied flexibly to fund additional data integration activities and/or other business and IT projects.

Superior Developer Productivity Saves Personnel Costs
At 41% of the typical enterprise IT budget, personnel staffing expenses, including salaries, benefits and occupancy, represent the largest category of IT spending according to recently published analyst research. This is roughly double the spending on software and on outsourcing, and two-and-a-half times the spending on hardware.

These staffing costs are not only high in absolute terms. With data integration efforts often representing half the work in a typical IT development project, data integration developer productivity is critically important on a relative basis as well.

As described in Part 2 of this series, data virtualization uses a streamlined architecture and development approach. Not only does this improve time-to-solution agility, it also improves developer productivity in several ways.

  • First, data virtualization allows rapid, iterative development of views and data services. The development and deployment time savings associated with this development approach directly translate into lower staffing costs.
  • Second, the SQL-based views typically used in data virtualization are a well-understood IT paradigm, and the IDEs for building them share common terminology and techniques with the IDEs of the most popular relational databases; the same is true of data services and popular SOA IDEs. These factors make data virtualization easy for developers to learn and reduce the training costs typically required when adopting new tools. (A minimal sketch of such a view follows this list.)
  • Third, graphically oriented IDEs simplify data virtualization solution development with significant built-in code generation and automatic query optimization. This enables less senior and lower cost development staff to build data integration solutions.
  • Fourth, the views and services built for one application can easily be reused across other applications. This further increases productivity and reduces staffing resource costs.
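To make the view concept concrete, here is a minimal, hypothetical sketch using Python's built-in sqlite3 module as a stand-in for a data virtualization layer. The source names, file paths, sample data and the customer_revenue view are invented for illustration and are not drawn from any particular product; the point is simply that a reusable, SQL-based view can federate two separate sources at query time without copying data into a new store.

```python
import os
import sqlite3
import tempfile

# Two separate SQLite files stand in for independent source systems (hypothetical data).
tmp = tempfile.mkdtemp()
crm_path = os.path.join(tmp, "crm.db")
erp_path = os.path.join(tmp, "erp.db")

with sqlite3.connect(crm_path) as crm:
    crm.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
    crm.executemany("INSERT INTO customers VALUES (?, ?)", [(1, "Acme"), (2, "Globex")])

with sqlite3.connect(erp_path) as erp:
    erp.execute("CREATE TABLE orders (customer_id INTEGER, amount REAL)")
    erp.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 1200.0), (1, 300.0), (2, 75.0)])

# The "virtualization layer": one connection attaches both sources and exposes a
# reusable view. Nothing is consolidated into a warehouse or mart; the join runs
# on demand whenever a consumer queries the view.
dv = sqlite3.connect(":memory:")
dv.execute(f"ATTACH DATABASE '{crm_path}' AS crm")
dv.execute(f"ATTACH DATABASE '{erp_path}' AS erp")
dv.execute("""
    CREATE TEMP VIEW customer_revenue AS
    SELECT c.name, SUM(o.amount) AS total_revenue
    FROM crm.customers AS c
    JOIN erp.orders AS o ON o.customer_id = c.id
    GROUP BY c.name
""")  # TEMP view so it may reference tables in the attached databases

# Any number of consuming applications can reuse the same view definition.
for name, total in dv.execute("SELECT * FROM customer_revenue ORDER BY total_revenue DESC"):
    print(name, total)
```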

Better Asset Leverage Lowers Infrastructure Costs
Large enterprises typically have hundreds, if not thousands, of data sources. While these data assets can be leveraged to provide business decision agility, these returns come at a cost. Each source needs to be efficiently operated and managed and the data effectively governed. These ongoing infrastructure costs typically dwarf initial hardware and software implementation costs.

Traditional data integration approaches, where data is consolidated in data warehouses or marts, add to the overall number of data sources. This necessitates not only greater up-front capital expenditures, but also increased spending for ongoing operations and management. In addition, every new copy of the data introduces an opportunity for inconsistency and lower data quality.

Protecting against these inevitable issues is a non-value-added activity that further diverts critical resources. Finally, more sources mean more complexity, which in turn means large, ongoing investments in coordination and synchronization activities.

These demands consume valuable resources that can be significantly reduced through the use of data virtualization. Because data virtualization requires fewer physical data repositories than traditional data integration approaches, enterprises that use data virtualization lower their capital expenditures as well as their operating, management and governance costs. In fact, many data virtualization users find these infrastructure savings alone can justify their entire investment in data virtualization technology.

Add Data Virtualization to Optimize Your Data Integration Portfolio
As a component of a broad data integration portfolio, data virtualization joins traditional approaches such as data consolidation into warehouses and marts via ETL, as well as messaging- and replication-based approaches that move data from one location to another.

Each of these approaches has strengths and limitations when addressing various business information needs, data source and consumer technologies, time-to-solution and resource agility requirements.

For example, a data warehouse approach to integration is often deployed when analyzing historical time-series data across multiple dimensions. Data virtualization is typically adopted to support one or more of the five popular data virtualization usage patterns:

  • BI data federation
  • Data warehouse extension
  • Enterprise data virtualization layer
  • Big data integration
  • Cloud data integration

Given the many information needs, integration challenges, and business agility objectives organizations must juggle, each data integration approach added to the portfolio increases the organization's data integration flexibility and thus its ability to deliver effective data integration solutions.

With data virtualization in the integration portfolio, the organization can optimally mix and match physical and virtual integration methods based on the distinct requirements of a specific application's information needs, source data characteristics and other critical factors such as time-to-solution, data latency and total cost of ownership.
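As a purely illustrative sketch of this mix-and-match reasoning, the toy heuristic below encodes a few of the decision factors named above (history depth, data latency tolerance, transformation complexity, time-to-solution). The factor names and thresholds are assumptions for illustration only, not a formula from the book or from any vendor.

```python
from dataclasses import dataclass

@dataclass
class IntegrationNeed:
    """Decision factors for one application's information need (illustrative assumptions)."""
    needs_deep_history: bool      # e.g., years of time-series data across many dimensions
    tolerates_stale_data: bool    # can the consumer work with yesterday's snapshot?
    heavy_transformation: bool    # complex cleansing or restructuring before use
    time_to_solution_weeks: int   # how quickly the solution must be delivered

def suggest_integration_style(need: IntegrationNeed) -> str:
    """Toy rule of thumb for choosing between physical and virtual integration."""
    if need.needs_deep_history or (need.heavy_transformation and need.tolerates_stale_data):
        return "physical consolidation (ETL into a warehouse or mart)"
    if need.time_to_solution_weeks <= 4 or not need.tolerates_stale_data:
        return "virtual integration (data virtualization views and data services)"
    return "either approach; decide on total cost of ownership"

# Example: a quick-turnaround report that must show current operational data.
print(suggest_integration_style(
    IntegrationNeed(needs_deep_history=False, tolerates_stale_data=False,
                    heavy_transformation=False, time_to_solution_weeks=2)))
```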

In addition, data virtualization provides the opportunity to refactor and optimize data models that are distributed across multiple applications and consolidated stores. For example, many enterprises use their BI tool's semantic layer and/or data warehouse schema to manage data definitions and models. Data virtualization provides the option to centralize this key functionality in the data virtualization layer. This can be especially useful when the enterprise has several BI tools and/or multiple warehouses and marts, each with its own schema and governance.

Conclusion
Data virtualization's streamlined architecture and development approach significantly improve developer productivity. Further, data virtualization requires fewer physical data repositories than traditional data integration approaches, which means that data virtualization users lower their capital expenditures as well as their operating, management and governance costs. Finally, adding data virtualization to the integration portfolio enables the optimization of physical and virtual integration methods.

These factors combine to provide significant cost savings that can be applied flexibly to fund additional data integration activities and/or other business and IT projects in the pursuit of business agility.

•   •   •

Editor's Note: Robert Eve is the co-author, along with Judith R. Davis, of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility, the first book published on the topic of data virtualization. This series of three articles on How Data Virtualization Delivers Business Agility includes excerpts from the book.

More Stories By Robert Eve

Robert Eve is the EVP of Marketing at Composite Software, the data virtualization gold standard, and co-author of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility. Bob's experience includes executive-level roles at leading enterprise software companies such as Mercury Interactive, PeopleSoft, and Oracle. Bob holds a Master of Science from the Massachusetts Institute of Technology and a Bachelor of Science from the University of California at Berkeley.
