
Making the Case for Data Virtualization

Hard metrics for hard times

Achieving compelling value from information technology is critical because IT is typically an enterprise or government agency's largest capital expense. Increasing business complexities and technology choices create greater demands for justification when making IT investments.

Cambridge, MA-based analyst firm Forrester Research recently reported that, "Business and government's purchases of computer and communication equipment, software, IT consulting, and integration services and IT outsourcing will decline by 3% on a global basis in 2009 when measured in U.S. dollars, then rise by 9% in 2010."

With smaller budgets, IT must validate purchases by correlating tangible business and IT returns that align with corporate strategic objectives. This validation should come early in the acquisition process as well as after the implementation to demonstrate actual value and justify expanded adoption.

Evaluating data virtualization first requires understanding how it specifically delivers value. This understanding can then be used to calculate value and provide the hard metrics required for hard economic times.

Data Virtualization
Data virtualization is used to integrate data from multiple disparate sources - anywhere across the extended enterprise - for consumption by front-end business solutions, including portals, mashups, reports, applications, and search engines (see Figure 1).

Figure 1: Data virtualization at a glance (Source: Composite Software, Inc.)

As a middleware technology, data virtualization (also called virtual data federation) has advanced beyond high-performance query and enterprise information integration (EII). As an IT architecture, it is implemented as a virtualized data layer, an information grid, an information fabric, or a data services layer in service-oriented architecture (SOA) environments. It can also be deployed on a project basis, for business intelligence (BI) and reporting, portals and mashups, and industry-focused single views.
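
To make the concept concrete, the sketch below shows, in Python, what a virtual or federated view does at its core: it resolves a request against multiple live sources on demand rather than copying their data into a consolidated store first. The sources and schema here are hypothetical stand-ins; a commercial platform adds query planning, optimization, security, and caching on top of this basic idea.

def query_crm(customer_id):
    """Stand-in for a live query against a CRM system (hypothetical)."""
    return {"customer_id": customer_id, "name": "Acme Corp", "segment": "Enterprise"}

def query_erp(customer_id):
    """Stand-in for a live query against an ERP system (hypothetical)."""
    return {"customer_id": customer_id, "open_orders": 7, "credit_hold": False}

def customer_360_view(customer_id):
    """Virtual view: federates both sources at request time, with no replication."""
    crm = query_crm(customer_id)
    erp = query_erp(customer_id)
    return {**crm, **erp}  # one merged, up-to-the-minute record

print(customer_360_view(42))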

Data Virtualization's Five Value Points
The many ways data virtualization delivers value to business functions and IT operations can be categorized as:

  1. Sales Growth
  2. Risk Reduction
  3. Time Savings
  4. Technology Savings
  5. Staff Savings

Converting these to hard metrics requires an understanding of the relationships between specific data virtualization capabilities and the IT and business value they deliver. Value calculations are made using one or more forms of return-on-investment (ROI) calculators. Examples of the five value points and their metrics along with actual customer case studies are provided below.
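
Throughout, the value calculations reduce to simple multiplications. A minimal sketch of such an ROI calculator follows; the function names and structure are illustrative, not a published vendor model.

def decision_value(num_decisions, value_per_decision):
    """Sales growth / risk reduction: decisions in scope x value per decision."""
    return num_decisions * value_per_decision

def acceleration_value(months_earlier, value_per_month):
    """Time savings: months delivered earlier x assignable value per month."""
    return months_earlier * value_per_month

def cost_savings(units_reduced, cost_per_unit):
    """Technology and staff savings: units cut (servers, hours) x cost per unit."""
    return units_reduced * cost_per_unit

def simple_roi(total_benefit, total_cost):
    """Classic ROI expressed as a percentage: (benefit - cost) / cost."""
    return 100.0 * (total_benefit - total_cost) / total_cost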

Sales Growth
As an important indicator of an enterprise's success (or, in the public sector, as an indicator of service growth or mission effectiveness), sales growth results from business strategies such as improved offerings, better customer support, and faster market response. Data virtualization supports these strategies by providing data federation, on-demand data access and delivery, and automated data discovery and modeling.

More Complete Data
Data federation capabilities enable the integration of disparate data on the fly, without physical data consolidation, making more complete data available to revenue-producing and customer-facing staff for better sales-related business decisions. The hard metrics (see the worked example after this list) include:

  • The number of decisions within the project's scope
  • The revenue-enhancing value of improving each decision based on the availability of more complete data
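
Plugging purely hypothetical figures into the decision_value helper sketched above:

# Hypothetical: 500 sales-related decisions per year fall within scope,
# each improved by an estimated $2,000 through more complete data.
print(f"${decision_value(500, 2_000):,}")  # $1,000,000 estimated annual impact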

An energy provider used data federation to increase oil production from its 10,000 wells. The data included high volumes of complex surface, subsurface, and business data from many disparate sources. The data virtualization solution federated actionable information to automate maintenance and repair decisions made throughout the day, freeing key staff for other value-adding tasks. This increased both staff and repair-rig productivity, key factors in the 10% gain achieved in well revenue performance and efficiency.

Fresher Data
Data virtualization's on-demand data access and delivery capabilities reach difficult-to-access data and deliver it to consuming applications in near real-time. Fresher data means more timely and accurate decision-making, often yielding sales growth. The hard metrics include:

  • The number of decisions in the project's scope
  • The revenue-enhancing value of improving each decision based on the availability of more timely data

A leading marketing information company used on-demand data access and delivery to grow sales by providing its large consumer goods clients with more timely access to its huge collection of consumer trends and demand information. The data virtualization layer enabled simplified and rapid development of the real-time queries required by the customers' self-service reporting tools. This capability was the key factor behind a 2% increase in revenue.

Quicker Time-to-Solution
Data virtualization's automated data discovery and rapid modeling capabilities reduce the time typically wasted on searching for relationships among data tables. These capabilities automate many of the detailed modeling and validation activities. With quicker time-to-solution, new sales-impacting applications and their associated revenues are available sooner. The hard metrics include:

  • The number of months the project can be delivered earlier
  • The revenue-enhancing value associated with each month where value was accelerated

An investment bank used data discovery and modeling to increase revenues by improving its trade order management, debt/equity market research, and risk management applications. The abstracted data layer in the SOA environment enabled rapid modeling and complex query creation that was shareable across the bank. The resulting 60% reduction in integration design and development time on revenue-enabling applications and portals contributed to a 2% revenue increase at the bank.

Risk Reduction
Risk reduction has become increasingly important as a result of greater complexity and regulation. Becoming more agile in response to risk, improving predictability in light of risk, and ensuring compliance with changing regulations and reporting mandates are a few of the strategies to reduce risk. Data virtualization supports these strategies through its data federation, on-demand data access and delivery, and data discovery and modeling capabilities.

These data virtualization capabilities and IT benefits are similar to those driving sales growth. However, for risk reduction, the business benefit is better risk visibility and faster problem remediation. In both cases, quicker time-to-solution helps get new or improved applications online faster. However, in the case of risk reduction, these might be applications for risk management or compliance reporting, rather than sales or customer management.

More Complete Data
Data federation provides more complete data to risk and compliance management staff, thereby improving data visibility and reducing overall risk. The hard metrics include:

  • The number of risk decisions in the project's scope
  • The risk reduction value of improving each decision based on the availability of more complete data

A global pharmaceutical company used data federation to shorten lengthy R&D cycles and reduce the risk of new product delays. Its Research Scientists' Workbench solution combined disparate structured and semi-structured research data from across the enterprise. Armed with more complete information, researchers were able to resolve problems faster, resulting in 60% fewer new product delays.

Fresher Data and Quicker Time-to-Solution
Data virtualization's on-demand data access and delivery capabilities improve the timeliness of data so risk issues can be remediated faster. Data virtualization's automated discovery and modeling accelerates new risk management and compliance reporting application development, thereby delivering their associated risk-reducing benefits sooner.

Time Savings
New information systems must deliver the data needed while reducing the latency between business event and response. So IT is under constant pressure to provide these new systems and their associated information more quickly. Strategies for saving development and deployment time as well as decreasing data latency are crucial. Data virtualization supports these strategies by providing a data services library; installation wizard, manager and clustering; and query optimization capabilities.

Less New Code, Greater Reuse
The data services library holds complete collections of reusable views and data services. Using these existing objects reduces the need for new coding and permits greater reuse across multiple applications, saving development time on both new applications and changes to existing ones. The hard metrics (see the sketch after this list) include:

  • The number of views or services planned
  • The savings per view or service
  • The percentage reuse factor for a specified time frame
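
These three factors simply multiply, as in this short sketch with hypothetical figures:

def reuse_savings(num_objects, savings_per_object, reuse_factor):
    """Development cost avoided by reusing views and data services."""
    return num_objects * savings_per_object * reuse_factor

# Hypothetical: 200 planned views/services, $3,000 saved each time one is
# reused, and an average of 1.5 reuses per object in the first year.
print(f"${reuse_savings(200, 3_000, 1.5):,.0f}")  # $900,000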

A major investment bank wanted to build new applications faster, but it couldn't because key reference data, such as counter-party accounts, was duplicated across multiple applications. Beyond slowing development, this proliferation contradicted good banking practices and data governance. The bank shaved 25% off its average development time by creating a shared data services library to house Web Services for sharing counter-party master reference data.

Easy Installation and Reliable Operation
Installation wizards, together with manager and clustering capabilities, accelerate and automate both the installation and the ongoing operation of data virtualization solutions. As a result, new solutions are deployed faster. The hard metrics include:

  • The number of months a project can be delivered earlier
  • The assignable value associated with each month the value was accelerated

A leading life sciences R&D organization needed to quickly prototype, develop, and deploy the new information solutions required to support strategic decisions by business executives. It used data virtualization to build and deploy virtual data marts in support of multiple data consumers including Microsoft SharePoint, Business Objects Business Intelligence, TIBCO Spotfire, Microsoft Excel, and various Web portals. This resulted in a 90% reduction in the time required to deploy new information sets.

High-Performance Data Delivery
Data virtualization's query optimization and caching capabilities help eliminate data latencies, speeding the delivery of critical information to users and applications, thereby shortening the time between business events and response.

A North American telecom chip maker targeted faster responses to customer requests. To do this, its sales force management analytics required up-to-the-minute data from the packaged CRM application as well as other systems. The manufacturer used data virtualization to optimize query performance, ultimately cutting average report runtimes from four minutes to 30 seconds or less.
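
Result caching is the easiest of these techniques to picture. The sketch below puts a time-bounded cache in front of a slow federated query; the 60-second TTL and the stand-in query are illustrative, and real platforms combine caching with query rewriting and source-side pushdown.

import time
from functools import lru_cache

CACHE_TTL_SECONDS = 60

def run_federated_query(report_id):
    """Stand-in for a multi-source query that is slow when run uncached."""
    time.sleep(0.1)  # simulate source latency
    return {"report_id": report_id, "rows": []}

@lru_cache(maxsize=128)
def _cached_report(report_id, ttl_bucket):
    # ttl_bucket advances every CACHE_TTL_SECONDS, forcing a periodic refresh.
    return run_federated_query(report_id)

def report(report_id):
    """Serve from cache while fresh; re-query the sources once stale."""
    return _cached_report(report_id, int(time.time() // CACHE_TTL_SECONDS))

print(report("sales_pipeline"))  # slow first call, cached thereafter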

Technology Savings
Just as storage, server, and application virtualization have demonstrated huge technology savings, data virtualization delivers similar savings by requiring fewer physical data repositories, along with fewer systems to operate and manage them. Many users find that these infrastructure savings alone justify the investment, which makes this a natural place to start a data virtualization deployment.

Fewer Physical Repositories, Lower Hardware, Software, and Facilities Costs
Data virtualization doesn't require replication into intermediate physical data repositories. Fewer physical data marts and operational data stores mean less supporting hardware and software, which in turn means less rack space, electricity, air conditioning, and management software, and lower facilities costs. The hard metrics include:

  • The number of servers reduced due to virtual federation rather than physical consolidation
  • The assignable hardware, software, and facilities cost associated with each server

A leading computer maker wanted to reduce the cost of its supply chain and customer management operational BI applications, which included more than 50 intermediate data marts. Each mart required a server, resulting in lifecycle hardware infrastructure costs of $20,000 each. It used data virtualization to provide a virtual supply chain data hub that replaced the physical data marts. This resulted in $1 million in infrastructure cost savings.
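
The arithmetic behind that figure is straightforward. Using the cost_savings helper from the ROI sketch above, with the case's own numbers:

# 50 intermediate data marts eliminated, each carrying roughly $20,000
# in lifecycle hardware and infrastructure cost (figures from the case).
print(f"${cost_savings(units_reduced=50, cost_per_unit=20_000):,}")  # $1,000,000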

Staff Savings
The cost of internal and outsourced staff is typically the largest IT expenditure. Substituting automated tools for labor is one strategy for cutting staffing costs. Increasing existing staff productivity by improving SDLC processes is another. Simplifying the work to employ lower-skilled (and therefore less costly) staff is a third. Data virtualization supports these strategies by delivering a user-friendly GUI development environment, along with automated data discovery and data services library capabilities.

Fewer Skills Required
Data virtualization's GUI development environment simplifies and automates detailed design and development work that would otherwise require more technically capable staff. Enterprises spend less on expensive consultants and can redeploy their highly skilled staff to more critical work. The hard metrics include:

  • The number of consulting staff hours being reduced
  • The cost per hour

A global money manager wanted to reduce the effort required by 100 financial analysts who build the complex portfolio models used by fund managers. Its solution was to build a virtualization layer surrounding the warehouse to abstract away the complexity of the underlying data. This simplification resulted in a financial analyst productivity increase of 25%, allowing many to be redeployed to develop additional financial analytics useful to the firm.

Greater Collaboration
Discovery capabilities within today's data virtualization platforms let end users validate new reporting solutions against live data early in the development process. Analysts and data designers can then transfer models to application developers and operations teams to complete the process. This encourages team collaboration and helps reduce rework. The hard metrics include:

  • The number of hours saved
  • The cost per hour

The same global money management firm cited above wanted to improve the collaboration of its 100 financial analysts. Many of its financial models relied on similar data and data models, but technology hindered these analysts from effectively sharing their work. A common virtualization layer over the financial research data warehouse provided the financial analysts with reusable data views that could be shared for the first time. In addition, IT provided a dedicated DBA and data architect who created the new views as needed. The improved collaboration resulted in higher portfolio returns and a 150% ROI in six months.

Metrics for Hard Times
By understanding data virtualization's specific contributing value factors, C-level executives and IT managers can more easily calculate both the estimated and the actual value of each data virtualization implementation under consideration, be it an architecture or a project. The resulting hard metrics, tied clearly to enterprise-wide goals, arm budget decision makers with the data they need to make confident decisions in hard economic times.

More Stories By Robert Eve

Robert "Bob" Eve is vice president of marketing at Composite Software. Prior to joining Composite, he held executive-level marketing and business development roles at several other enterprise software companies. At Informatica and Mercury Interactive, he helped penetrate new segments in his role as the vice president of Market Development. Bob ran Marketing and Alliances at Kintana (acquired by Mercury Interactive in 2003) where he defined the IT Governance category. As vice president of Alliances at PeopleSoft, Bob was responsible for more than 300 partners and 100 staff members. Bob has an MS in management from MIT and a BS in business administration with honors from University of California, Berkeley. He is a frequent contributor to publications including SYS-CON's SOA World Magazine and Virtualization Journal.
