Making the Case for Data Virtualization

Hard metrics for hard times

Achieving compelling value from information technology is critical because IT is typically an enterprise or government agency's largest capital expense. Growing business complexity and an expanding range of technology choices place greater demands on the justification of IT investments.

Cambridge, MA-based analyst firm Forrester Research recently reported that, "Business and government's purchases of computer and communication equipment, software, IT consulting, and integration services and IT outsourcing will decline by 3% on a global basis in 2009 when measured in U.S. dollars, then rise by 9% in 2010."

With smaller budgets, IT must validate purchases by correlating tangible business and IT returns that align with corporate strategic objectives. This validation should come early in the acquisition process as well as after the implementation to demonstrate actual value and justify expanded adoption.

Evaluating data virtualization first requires understanding how it specifically delivers value. This understanding can then be used to calculate value and provide the hard metrics required for hard economic times.

Data Virtualization
Data virtualization is used to integrate data from multiple disparate sources - anywhere across the extended enterprise - for consumption by front-end business solutions, including portals, mashups, reports, applications, and search engines (see Figure 1).

[Figure 1: Data virtualization at a glance. Source: Composite Software, Inc.]

As a middleware technology, data virtualization (also known as virtual data federation) has advanced beyond high-performance query and enterprise information integration (EII). As an IT architecture, data virtualization is implemented as a virtualized data layer, an information grid, an information fabric, or a data services layer in service-oriented architecture (SOA) environments. It can also be deployed on a project basis for business intelligence (BI) and reporting, portals and mashups, and industry-focused single views.

Data Virtualization's Five Value Points
The many ways data virtualization delivers value to business functions and IT operations can be categorized as:

  1. Sales Growth
  2. Risk Reduction
  3. Time Savings
  4. Technology Savings
  5. Staff Savings

Converting these to hard metrics requires an understanding of the relationships between specific data virtualization capabilities and the IT and business value they deliver. Value calculations are made using one or more forms of return-on-investment (ROI) calculators. Examples of the five value points and their metrics along with actual customer case studies are provided below.
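
Whatever form the calculator takes, the underlying arithmetic follows a common pattern: count the units affected (decisions, months, views, servers, or hours), multiply by an assignable value per unit, and compare the total benefit to the investment. The following minimal Python sketch illustrates the pattern; the function names and all figures are illustrative assumptions, not taken from any of the case studies below.

```python
# Minimal ROI-calculator sketch; names and numbers are illustrative
# assumptions, not figures from the article.

def benefit(units: float, value_per_unit: float) -> float:
    """Generic value-point benefit, e.g., decisions improved x value per decision."""
    return units * value_per_unit

def roi_percent(total_benefit: float, investment: float) -> float:
    """Return on investment: net gain divided by cost, as a percentage."""
    return (total_benefit - investment) / investment * 100

# Hypothetical example: 400 sales decisions improved at $500 each,
# against a $100,000 data virtualization investment.
total = benefit(400, 500)                          # $200,000
print(f"ROI: {roi_percent(total, 100_000):.0f}%")  # ROI: 100%
```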

Sales Growth
As an important indicator of an enterprise's success (or, in the public sector, of service growth or mission effectiveness), sales growth results from business strategies such as improved offerings, better customer support, and faster market response. Data virtualization supports these strategies by providing data federation, on-demand data access and delivery, and automated data discovery and modeling.

More Complete Data
Data federation capabilities enable the integration of disparate data on the fly, without physical data consolidation, making more complete data available to revenue-producing and customer-facing staff for better sales-related business decisions. Hard metrics include:

  • The number of decisions within the project's scope
  • The revenue-enhancing value of improving each decision based on the availability of more complete data

An energy provider used data federation to increase oil production from its 10,000 wells. The data included high volumes of complex surface, subsurface, and business data from many disparate sources. The data virtualization solution federated actionable information to automate the maintenance and repair decisions made throughout the day, freeing key staff for other value-adding tasks. This increased both staff and repair rig productivity, key factors in the 10% increase achieved in well revenue and efficiency.

Fresher Data
Data virtualization's on-demand data access and delivery capabilities reach difficult-to-access data and deliver it to consuming applications in near real-time. Fresher data means more timely and accurate decision-making, often yielding sales growth. The hard metrics include:

  • The number of decisions in the project's scope
  • The revenue-enhancing value of improving each decision based on the availability of more timely data

A leading marketing information company used on-demand data access and delivery to grow sales by providing its large consumer goods clients with more timely access to its huge collection of consumer trends and demand information. The data virtualization layer enabled simplified and rapid development of the real-time queries required by the customers' self-service reporting tools. This capability was the key factor behind a 2% increase in revenue.

Quicker Time-to-Solution
Data virtualization's automated data discovery and rapid modeling capabilities reduce the time typically wasted on searching for relationships among data tables. These capabilities automate many of the detailed modeling and validation activities. With quicker time-to-solution, new sales-impacting applications and their associated revenues are available sooner. The hard metrics include:

  • The number of months the project can be delivered earlier
  • The revenue-enhancing value associated with each month where value was accelerated

An investment bank used data discovery and modeling to increase revenues by improving its trade order management, debt/equity market research, and risk management applications. The abstracted data layer in the SOA environment enabled rapid modeling and complex query creation that was shareable across the bank. The resulting 60% reduction in integration design and development time on revenue-enabling applications and portals contributed to a 2% revenue increase at the bank.
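
The time-to-solution metrics above multiply out the same way; a brief sketch, with both figures hypothetical:

```python
# Acceleration-value sketch for the time-to-solution metrics above;
# both figures are hypothetical, not taken from the case study.
months_earlier = 4           # months the project is delivered sooner
value_per_month = 50_000     # revenue value assigned to each month ($)

acceleration_value = months_earlier * value_per_month
print(f"Value of earlier delivery: ${acceleration_value:,}")  # $200,000
```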

Risk Reduction
Risk reduction has become increasingly important as a result of greater complexity and regulation. Becoming more agile in response to risk, improving predictability in light of risk, and ensuring compliance with changing regulations and reporting mandates are a few of the strategies to reduce risk. Data virtualization supports these strategies through its data federation, on-demand data access and delivery, and data discovery and modeling capabilities.

These data virtualization capabilities and IT benefits are similar to those driving sales growth. For risk reduction, however, the business benefit is better risk visibility and faster problem remediation. In both cases, quicker time-to-solution helps get new or improved applications online faster; in the case of risk reduction, these might be applications for risk management or compliance reporting rather than for sales or customer management.

More Complete Data
Data federation provides more complete data to risk and compliance management staff, thereby improving data visibility and reducing overall risk. The hard metrics include:

  • The number of risk decisions in the project's scope
  • The risk reduction value of improving each decision based on the availability of more complete data

A global pharmaceutical company used data federation to shorten lengthy R&D cycles and reduce the risk of new product delays. Its Research Scientists' Workbench solution combined disparate structured and semi-structured research data from across the enterprise. Armed with more complete information, researchers were able to resolve problems faster, resulting in 60% fewer new product delays.

Fresher Data and Quicker Time-to-Solution
Data virtualization's on-demand data access and delivery capabilities improve the timeliness of data so risk issues can be remediated faster. Data virtualization's automated discovery and modeling accelerates new risk management and compliance reporting application development, thereby delivering their associated risk-reducing benefits sooner.

Time Savings
New information systems must deliver the data needed while reducing the latency between business event and response, so IT is under constant pressure to provide these new systems and their associated information more quickly. Strategies for saving development and deployment time, as well as for decreasing data latency, are crucial. Data virtualization supports these strategies by providing a data services library; installation wizards, manager, and clustering capabilities; and query optimization.

Less New Code, Greater Reuse
The data services library holds complete collections of reusable views and data services. Using these existing objects reduces the need for new coding and permits greater reuse across multiple applications, saving project development time for new applications as well as for existing ones when changes are needed. The hard metrics include:

  • The number of views or services planned
  • The savings per view or service
  • The percentage reuse factor for a specified time frame

A major investment bank wanted to build new applications faster but couldn't because key reference data, such as counter-party accounts, was duplicated across multiple applications. Besides slowing development, this proliferation contradicted good banking practices and data governance. The bank shaved 25% off its average development time by creating a shared data services library to house Web Services for sharing counter-party master reference data.
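
The three reuse metrics combine multiplicatively. A brief sketch with illustrative figures (none drawn from the bank's case study):

```python
# Reuse-savings sketch combining the three metrics above;
# counts and rates are illustrative assumptions.
views_planned = 150         # reusable views/data services to be built
savings_per_view = 3_000    # development cost avoided per reuse ($)
reuse_rate = 0.30           # share of views reused in the time frame

reuse_savings = views_planned * savings_per_view * reuse_rate
print(f"Estimated reuse savings: ${reuse_savings:,.0f}")  # $135,000
```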

Easy Installation and Reliable Operation
Installation wizards, along with manager and clustering capabilities, accelerate and automate the installation and runtime operation of data virtualization solutions. As a result, new solutions are deployed faster. The hard metrics include:

  • The number of months a project can be delivered earlier
  • The assignable value associated with each month the value was accelerated

A leading life sciences R&D organization needed to quickly prototype, develop, and deploy the new information solutions required to support strategic decisions by business executives. It used data virtualization to build and deploy virtual data marts in support of multiple data consumers including Microsoft SharePoint, Business Objects Business Intelligence, TIBCO Spotfire, Microsoft Excel, and various Web portals. This resulted in a 90% reduction in the time required to deploy new information sets.

High-Performance Data Delivery
Data virtualization's query optimization and caching capabilities help eliminate data latencies, speeding the delivery of critical information to users and applications, thereby shortening the time between business events and response.

A North American telecom chip maker targeted faster responses to customer requests. To do this, its sales force management analytics required up-to-the-minute data from the packaged Salesforce.com CRM application as well as other systems. The manufacturer used data virtualization to optimize query performance, ultimately cutting average report runtimes from four minutes to 30 seconds or less.

Technology Savings
Just as storage, server, and application virtualization have demonstrated huge technology savings, data virtualization delivers similar savings by requiring fewer physical data repositories, along with fewer systems to operate and manage them. Many users find that these technology infrastructure savings alone justify the investment, making this a natural place to start a data virtualization deployment.

Fewer Physical Repositories, Lower Hardware, Software, and Facilities Costs
Data virtualization does not require data to be replicated into intermediate physical repositories. Fewer physical data marts and operational data stores mean less supporting hardware and software. This, in turn, means lower costs for rack space, electricity, air conditioning, management software, and other facilities. The hard metrics include:

  • The number of servers reduced due to virtual federation rather than physical consolidation
  • The assignable hardware, software, and facilities cost associated with each server

A leading computer maker wanted to reduce the cost of its supply chain and customer management operational BI applications, which included more than 50 intermediate data marts. Each mart required a server, resulting in lifecycle hardware infrastructure costs of $20,000 each. It used data virtualization to provide a virtual supply chain data hub that replaced the physical data marts. This resulted in $1 million in infrastructure cost savings.
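
The arithmetic here reconciles directly with the reported result: roughly 50 marts at a $20,000 lifecycle cost each yields the $1 million in savings. As a sketch:

```python
# Worked directly from the case study's figures: roughly 50 data marts
# eliminated, each carrying a $20,000 lifecycle hardware cost.
marts_eliminated = 50
lifecycle_cost_per_mart = 20_000  # hardware infrastructure cost per server ($)

infrastructure_savings = marts_eliminated * lifecycle_cost_per_mart
print(f"Infrastructure savings: ${infrastructure_savings:,}")  # $1,000,000
```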

Staff Savings
The cost of internal and outsourced staff is typically the largest IT expenditure. Substituting automated tools for labor is one strategy for cutting staffing costs. Increasing existing staff productivity by improving software development lifecycle (SDLC) processes is another. Simplifying the work so that lower-skilled (and therefore less costly) staff can perform it is a third. Data virtualization supports these strategies by delivering a user-friendly GUI development environment, along with automated data discovery and data services library capabilities.

Fewer Skills Required
Data virtualization's GUI development environment simplifies and automates detailed design and development work that would otherwise require more technically skilled staff. Enterprises find they spend less money on expensive consultants and can redeploy their highly skilled staff to other, more critical work. The hard metrics include:

  • The number of consulting staff hours being reduced
  • The cost per hour

A global money manager wanted to reduce the effort required by the 100 financial analysts who build the complex portfolio models used by fund managers. Its solution was to build a virtualization layer around its financial research data warehouse to abstract away the complexity of the underlying data. This simplification increased financial analyst productivity by 25%, allowing many analysts to be redeployed to develop additional financial analytics useful to the firm.
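
To translate such a productivity gain into the staff-savings metrics above, the analyst count and 25% gain from this case study can be combined with assumed working hours and a fully loaded hourly rate; the latter two figures below are hypothetical:

```python
# Staff-savings sketch; the analyst count and 25% productivity gain come
# from the case study, while hours and hourly rate are assumptions.
analysts = 100
productivity_gain = 0.25         # 25% productivity increase
hours_per_analyst_year = 2_000   # assumed working hours per analyst per year
hourly_rate = 150                # assumed fully loaded cost per hour ($)

hours_freed = analysts * productivity_gain * hours_per_analyst_year
capacity_value = hours_freed * hourly_rate
print(f"Analyst hours freed per year: {hours_freed:,.0f}")     # 50,000
print(f"Redeployable capacity value: ${capacity_value:,.0f}")  # $7,500,000
```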

Greater Collaboration
Discovery capabilities within today's data virtualization platforms let teams validate new reporting solutions with end users, against live data, early in the development process. Furthermore, analysts and data designers can hand off models to application developers and operations teams to complete the process. This encourages team collaboration and helps reduce rework. The hard metrics include:

  • The number of hours saved
  • The cost per hour

The same global money management firm cited above wanted to improve the collaboration of its 100 financial analysts. Many of its financial models relied on similar data and data models, but technology hindered these analysts from effectively sharing their work. A common virtualization layer over the financial research data warehouse provided the financial analysts with reusable data views that could be shared for the first time. In addition, IT provided a dedicated DBA and data architect who created the new views as needed. The improved collaboration resulted in higher portfolio returns and a 150% ROI in six months.
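
As a sanity check on the reported figure, a 150% ROI means the measured six-month benefit was 2.5 times the investment. In the sketch below, only the 150% comes from the case study; the investment amount is a hypothetical placeholder:

```python
# ROI sanity check; only the 150% figure comes from the case study,
# and the investment amount is a hypothetical placeholder.
investment = 400_000                      # assumed project cost ($)
roi = 1.50                                # 150% ROI over six months

implied_benefit = investment * (1 + roi)  # benefit = 2.5x the investment
print(f"Benefit implied by 150% ROI: ${implied_benefit:,.0f}")  # $1,000,000
```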

Metrics for Hard Times
By understanding the specific contributing value factors of data virtualization, C-level executives and IT managers can more easily calculate both the estimated and actual value of each data virtualization implementation under consideration, be it architecture or project. The resulting hard metrics that clearly contribute to enterprise-wide goals arm budget decision makers with the data they need to make confident decisions in hard economic times.

More Stories By Robert Eve

Robert "Bob" Eve is vice president of marketing at Composite Software. Prior to joining Composite, he held executive-level marketing and business development roles at several other enterprise software companies. At Informatica and Mercury Interactive, he helped penetrate new segments in his role as the vice president of Market Development. Bob ran Marketing and Alliances at Kintana (acquired by Mercury Interactive in 2003) where he defined the IT Governance category. As vice president of Alliances at PeopleSoft, Bob was responsible for more than 300 partners and 100 staff members. Bob has an MS in management from MIT and a BS in business administration with honors from University of California, Berkeley. He is a frequent contributor to publications including SYS-CON's SOA World Magazine and Virtualization Journal.
