Containers Expo Blog: Article

How Data Virtualization Improves Business Agility – Part 3

Optimize staff, infrastructure and integration approach for maximum ROI

While the benefits derived from greater business agility are significant, costs are also an important factor to consider. This is especially true in today's extremely competitive business environment and difficult economic times.

This article, the last in a series of three articles on how data virtualization delivers business agility, focuses on resource agility.

Parts 1 and 2 addressed business decision agility and time-to-solution agility, respectively.

Resource Agility Is a Key Enabler of Business Agility
In the recently published Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility, resource agility was identified as the third key element in an enterprise's business agility strategy, along with business decision agility and time-to-solution agility.

Data virtualization directly enables greater resource agility through superior developer productivity, lower infrastructure costs and better optimization of data integration solutions.

These factors combine to provide significant cost savings that can be applied flexibly to fund additional data integration activities and/or other business and IT projects.

Superior Developer Productivity Saves Personnel Costs
At 41% of the typical enterprise IT budget, personnel staffing expenses, including salaries, benefits and occupancy, represent the largest category of IT spending according to recently published analyst research. This spending is double that of both software and outsourcing, and two-and-a-half times that of hardware.

These staffing costs are high in absolute terms. And because data integration efforts often represent half the work in a typical IT development project, data integration developer productivity is critically important in relative terms as well.

As described in Part 2 of this series, data virtualization uses a streamlined architecture and development approach. Not only does this improve time-to-solution agility, it also improves developer productivity in several ways.

  • First, data virtualization allows rapid, iterative development of views and data services. The development and deployment time savings associated with this development approach directly translate into lower staffing costs.
  • Second, the typically SQL-based views used in data virtualization are a well-understood IT paradigm. And the IDEs for building these views share common terminology and techniques with the IDEs for the most popular relational databases. The same can be said for data services and popular SOA IDEs. These factors make data virtualization easy for developers to learn and reduce the training costs typically required when adopting new tools.
  • Third, graphically oriented IDEs simplify data virtualization solution development with significant built-in code generation and automatic query optimization. This enables less senior and lower cost development staff to build data integration solutions.
  • Fourth, the views and services built for one application can easily be reused across other applications. This further increases productivity and reduces staffing resource costs.
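The view-based development and reuse described above can be sketched in a few lines. This is an illustrative example only: SQLite stands in for a data virtualization server, and the table names and schema (orders, customers, customer_revenue) are hypothetical.

```python
import sqlite3

# Hypothetical sketch: a single SQL view, defined once, then reused by
# multiple consuming applications. SQLite stands in for a data
# virtualization server; the schema and names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL);
    CREATE TABLE customers (customer_id INTEGER, region TEXT);
    INSERT INTO orders VALUES (1, 10, 250.0), (2, 10, 100.0), (3, 20, 75.0);
    INSERT INTO customers VALUES (10, 'EMEA'), (20, 'APAC');

    -- The reusable view: built once, queried by any application.
    CREATE VIEW customer_revenue AS
        SELECT c.customer_id, c.region, SUM(o.amount) AS revenue
        FROM customers c JOIN orders o ON o.customer_id = c.customer_id
        GROUP BY c.customer_id, c.region;
""")

# "Application 1": a BI-style aggregate by region.
by_region = conn.execute(
    "SELECT region, SUM(revenue) FROM customer_revenue "
    "GROUP BY region ORDER BY region"
).fetchall()

# "Application 2": a per-customer lookup against the same view.
emea = conn.execute(
    "SELECT customer_id, revenue FROM customer_revenue WHERE region = 'EMEA'"
).fetchall()

print(by_region)  # [('APAC', 75.0), ('EMEA', 350.0)]
print(emea)       # [(10, 350.0)]
```

Because both consumers query the same view, a change to the revenue calculation is made once in the view definition rather than in every application.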

Better Asset Leverage Lowers Infrastructure Costs
Large enterprises typically have hundreds, if not thousands, of data sources. While these data assets can be leveraged to provide business decision agility, these returns come at a cost. Each source needs to be efficiently operated and managed and the data effectively governed. These ongoing infrastructure costs typically dwarf initial hardware and software implementation costs.

Traditional data integration approaches, where data is consolidated in data warehouses or marts, add to the overall number of data sources. This necessitates not only greater up-front capital expenditures, but also increased spending for ongoing operations and management. In addition, every new copy of the data introduces an opportunity for inconsistency and lower data quality.

Protecting against these inevitable issues is a non-value-added activity that further diverts critical resources. Finally, more sources equal more complexity. This means large, ongoing investments in coordination and synchronization activities.

These demands consume valuable resources that can be significantly reduced through the use of data virtualization. Because data virtualization requires fewer physical data repositories than traditional data integration approaches, enterprises that use data virtualization lower their capital expenditures as well as their operating, management and governance costs. In fact, many data virtualization users find these infrastructure savings alone can justify their entire investment in data virtualization technology.

Add Data Virtualization to Optimize Your Data Integration Portfolio
As a component of a broad data integration portfolio, data virtualization joins traditional data integration approaches such as data consolidation in the form of data warehouses and marts enabled by ETL, as well as messaging- and replication-based approaches that move data from one location to another.

Each of these approaches has strengths and limitations when addressing various business information needs, data source and consumer technologies, time-to-solution and resource agility requirements.

For example, a data warehouse approach to integration is often deployed when analyzing historical time-series data across multiple dimensions. Data virtualization is typically adopted to support one or more of the five popular data virtualization usage patterns:

  • BI data federation
  • Data warehouse extension
  • Enterprise data virtualization layer
  • Big data integration
  • Cloud data integration
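The first of these patterns, data federation, can be illustrated with a small sketch. The assumptions are loud ones: two separate SQLite files stand in for independent physical sources (say, a CRM system and an ERP system), an in-memory connection stands in for the virtualization layer, and all names (crm.db, erp.db, accounts, invoices, account_totals) are hypothetical.

```python
import os
import sqlite3
import tempfile

# Hypothetical sketch of the federation pattern: two independent
# "sources" are queried in place through one virtual view, without
# copying either source into a new physical repository.
tmp = tempfile.mkdtemp()
crm_path = os.path.join(tmp, "crm.db")
erp_path = os.path.join(tmp, "erp.db")

with sqlite3.connect(crm_path) as crm:
    crm.execute("CREATE TABLE accounts (acct_id INTEGER, name TEXT)")
    crm.execute("INSERT INTO accounts VALUES (1, 'Acme'), (2, 'Globex')")
crm.close()
with sqlite3.connect(erp_path) as erp:
    erp.execute("CREATE TABLE invoices (acct_id INTEGER, total REAL)")
    erp.execute("INSERT INTO invoices VALUES (1, 500.0), (1, 250.0), (2, 90.0)")
erp.close()

# The "virtualization layer": attach both sources and define a view
# that federates them at query time. (A TEMP view is used because
# SQLite only allows temporary views to span attached databases.)
layer = sqlite3.connect(":memory:")
layer.execute(f"ATTACH DATABASE '{crm_path}' AS crm")
layer.execute(f"ATTACH DATABASE '{erp_path}' AS erp")
layer.execute("""
    CREATE TEMP VIEW account_totals AS
        SELECT a.name, SUM(i.total) AS billed
        FROM crm.accounts a JOIN erp.invoices i ON i.acct_id = a.acct_id
        GROUP BY a.name
""")
rows = layer.execute(
    "SELECT name, billed FROM account_totals ORDER BY name"
).fetchall()
print(rows)  # [('Acme', 750.0), ('Globex', 90.0)]
```

The point of the sketch is what is absent: no third copy of the data is created, so there is no new repository to operate, manage or keep consistent.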

Given the many information needs, integration challenges, and business agility objectives organizations have to juggle, each data integration approach added to the portfolio improves the organization's flexibility and thus its ability to deliver effective data integration solutions.

With data virtualization in the integration portfolio, the organization can optimally mix and match physical and virtual integration methods based on the distinct requirements of a specific application's information needs, source data characteristics and other critical factors such as time-to-solution, data latency and total cost of ownership.

In addition, data virtualization provides the opportunity to refactor and optimize data models that are distributed across multiple applications and consolidated stores. For example, many enterprises use their BI tool's semantic layer and/or data warehouse schema to manage data definitions and models. Data virtualization provides the option to centralize this key functionality in the data virtualization layer. This can be especially useful in cases where the enterprise has several BI tools and/or multiple warehouses and marts, each with their own schemas and governance.

Conclusion
Data virtualization's streamlined architecture and development approach significantly improve developer productivity. Further, data virtualization requires fewer physical data repositories than traditional data integration approaches. This means that data virtualization users lower their capital expenditures as well as their operating, management and governance costs. Finally, adding data virtualization to the integration portfolio enables the optimization of physical and virtual integration methods.

These factors combine to provide significant cost savings that can be applied flexibly to fund additional data integration activities and/or other business and IT projects in the pursuit of business agility.

•   •   •

Editor's Note: Robert Eve is the co-author, along with Judith R. Davis, of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility, the first book published on the topic of data virtualization. This series of three articles on How Data Virtualization Delivers Business Agility includes excerpts from the book.

More Stories By Robert Eve

Robert Eve is the EVP of Marketing at Composite Software, the data virtualization gold standard and co-author of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility. Bob's experience includes executive level roles at leading enterprise software companies such as Mercury Interactive, PeopleSoft, and Oracle. Bob holds a Masters of Science from the Massachusetts Institute of Technology and a Bachelor of Science from the University of California at Berkeley.
