Making the Case for Data Virtualization

Hard metrics for hard times

Achieving compelling value from information technology is critical because IT is typically an enterprise or government agency's largest capital expense. Growing business complexity and an expanding range of technology choices place ever greater demands on the justification of IT investments.

Cambridge, MA-based analyst firm Forrester Research recently reported that "business and government purchases of computer and communication equipment, software, IT consulting and integration services, and IT outsourcing will decline by 3% on a global basis in 2009 when measured in U.S. dollars, then rise by 9% in 2010."

With smaller budgets, IT must validate purchases by tying tangible business and IT returns to corporate strategic objectives. This validation should come early in the acquisition process, and again after implementation, to demonstrate actual value and justify expanded adoption.

Evaluating data virtualization first requires understanding how it specifically delivers value. This understanding can then be used to calculate value and provide the hard metrics required for hard economic times.

Data Virtualization
Data virtualization is used to integrate data from multiple disparate sources - anywhere across the extended enterprise - for consumption by front-end business solutions, including portals, mashups, reports, applications, and search engines (see Figure 1).

[Figure 1: Data virtualization at a glance. Source: Composite Software, Inc.]

As a middleware technology, data virtualization (also called virtual data federation) has advanced beyond high-performance query and enterprise information integration (EII). As an IT architecture, it is implemented as a virtualized data layer, an information grid, an information fabric, or a data services layer in service-oriented architecture (SOA) environments. It can also be deployed on a project basis for business intelligence (BI) and reporting, portals and mashups, and industry-focused single views.

Data Virtualization's Five Value Points
The many ways data virtualization delivers value to business functions and IT operations can be categorized as:

  1. Sales Growth
  2. Risk Reduction
  3. Time Savings
  4. Technology Savings
  5. Staff Savings

Converting these to hard metrics requires an understanding of the relationships between specific data virtualization capabilities and the IT and business value they deliver. Value calculations are made using one or more forms of return-on-investment (ROI) calculator, and the same count-times-value pattern recurs in each value point below. Examples of the five value points and their metrics, along with actual customer case studies, follow.
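
To make that pattern concrete, here is a minimal sketch of such a calculator. The function names and the figures in the example are illustrative assumptions, not numbers from any vendor's tool.

```python
# Sketch of the count-times-value pattern behind the hard metrics below:
# every value point reduces to (units in scope) x (value per unit).

def point_value(units_in_scope: int, value_per_unit: float) -> float:
    """Generic hard-metric calculation: units x value per unit."""
    return units_in_scope * value_per_unit

def simple_roi(total_benefit: float, total_cost: float) -> float:
    """Classic ROI: (benefit - cost) / cost, as a percentage."""
    return (total_benefit - total_cost) / total_cost * 100

# Hypothetical example: 200 sales decisions improved, each worth $5,000,
# against a $400,000 data virtualization investment.
benefit = point_value(units_in_scope=200, value_per_unit=5_000)
print(f"Benefit: ${benefit:,.0f}; ROI: {simple_roi(benefit, 400_000):.0f}%")
```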

Sales Growth
As an important indicator of an enterprise's success (or, in the public sector, as an indicator of service growth or mission effectiveness), sales growth results from business strategies such as improved offerings, better customer support, and faster market response. Data virtualization supports these strategies by providing data federation, on-demand data access and delivery, and automated data discovery and modeling.

More Complete Data
Data federation capabilities enable the integration of disparate data on the fly, without physical data consolidation, making more complete data available to revenue-producing and customer-facing staff for better sales-related business decisions (a toy illustration follows the list below). Hard metrics include:

  • The number of decisions within the project's scope
  • The revenue-enhancing value of improving each decision based on the availability of more complete data
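
The following toy sketch shows the federation idea: one query spans two disparate sources at request time, with no copy into an intermediate store. The source systems, field names, and data are hypothetical; a real data virtualization platform would push optimized queries down to the underlying systems.

```python
crm_orders = [  # stand-in for a CRM system
    {"customer": "Acme", "order_total": 12_500},
    {"customer": "Globex", "order_total": 8_200},
]
erp_credit = [  # stand-in for an ERP system
    {"customer": "Acme", "credit_limit": 50_000},
    {"customer": "Globex", "credit_limit": 10_000},
]

def federated_customer_view():
    """Join both sources at query time, yielding one virtual view."""
    credit = {row["customer"]: row["credit_limit"] for row in erp_credit}
    for order in crm_orders:
        yield {**order, "credit_limit": credit.get(order["customer"])}

for row in federated_customer_view():
    print(row)  # complete customer picture, no physical consolidation
```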

An energy provider used data federation to increase oil production from its 10,000 wells. The data included high volumes of complex surface, subsurface, and business data from many disparate sources. The data virtualization solution federated actionable information to automate the maintenance and repair decisions made throughout the day, freeing key resources for other value-adding tasks. This increased both staff and repair-rig productivity, key factors in the 10% increase in well revenue performance and efficiency.

Fresher Data
Data virtualization's on-demand data access and delivery capabilities reach difficult-to-access data and deliver it to consuming applications in near real-time. Fresher data means more timely and accurate decision-making, often yielding sales growth. The hard metrics include:

  • The number of decisions in the project's scope
  • The revenue-enhancing value of improving each decision based on the availability of more timely data

A leading marketing information company used on-demand data access and delivery to grow sales by providing its large consumer goods clients with more timely access to its huge collection of consumer trends and demand information. The data virtualization layer enabled simplified and rapid development of the real-time queries required by the customers' self-service reporting tools. This capability was the key factor behind a 2% increase in revenue.

Quicker Time-to-Solution
Data virtualization's automated data discovery and rapid modeling capabilities reduce the time typically wasted on searching for relationships among data tables. These capabilities automate many of the detailed modeling and validation activities. With quicker time-to-solution, new sales-impacting applications and their associated revenues are available sooner. The hard metrics include:

  • The number of months earlier the project can be delivered
  • The revenue-enhancing value associated with each month of accelerated delivery

An investment bank used data discovery and modeling to increase revenues by improving its trade order management, debt/equity market research, and risk management applications. The abstracted data layer in the SOA environment enabled rapid modeling and complex query creation that was shareable across the bank. The resulting 60% reduction in integration design and development time on revenue-enabling applications and portals contributed to a 2% revenue increase at the bank.

Risk Reduction
Risk reduction has become increasingly important as a result of greater complexity and regulation. Becoming more agile in response to risk, improving predictability in light of risk, and ensuring compliance with changing regulations and reporting mandates are a few of the strategies to reduce risk. Data virtualization supports these strategies through its data federation, on-demand data access and delivery, and data discovery and modeling capabilities.

These data virtualization capabilities and IT benefits are similar to those driving sales growth. However, for risk reduction, the business benefit is better risk visibility and faster problem remediation. In both cases, quicker time-to-solution helps get new or improved applications online faster. However, in the case of risk reduction, these might be applications for risk management or compliance reporting, rather than sales or customer management.

More Complete Data
Data federation provides more complete data to risk and compliance management staff, thereby improving data visibility and reducing overall risk. The hard metrics include:

  • The number of risk decisions in the project's scope
  • The risk reduction value of improving each decision based on the availability of more complete data

A global pharmaceutical company used data federation to shorten lengthy R&D cycles and reduce the risk of new product delays. Its Research Scientists' Workbench solution combined disparate structured and semi-structured research data from across the enterprise. Armed with more complete information, researchers were able to resolve problems faster, resulting in 60% fewer new product delays.

Fresher Data and Quicker Time-to-Solution
Data virtualization's on-demand data access and delivery capabilities improve the timeliness of data so risk issues can be remediated faster. Data virtualization's automated discovery and modeling accelerates new risk management and compliance reporting application development, thereby delivering their associated risk-reducing benefits sooner.

Time Savings
New information systems must deliver the data needed while reducing the latency between business event and response, so IT is under constant pressure to provide these new systems and their associated information more quickly. Strategies for saving development and deployment time, as well as for decreasing data latency, are crucial. Data virtualization supports these strategies by providing a data services library; installation wizards, management, and clustering; and query optimization capabilities.

Less New Code, Greater Reuse
The data services library holds complete collections of reusable views and data services. Using these existing objects lessens the need for new coding and permits greater reuse across multiple applications, saving project development time both for new applications and for existing ones when changes are needed. The hard metrics (sketched in code after this list) include:

  • The number of views or services planned
  • The savings per view or service
  • The percentage reuse factor for a specified time frame
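
A minimal sketch of that reuse calculation follows; the function name and the example figures are illustrative assumptions.

```python
def reuse_savings(num_objects: int, savings_per_object: float,
                  reuse_factor: float) -> float:
    """Savings from reusable views/services over a given time frame."""
    return num_objects * savings_per_object * reuse_factor

# Hypothetical example: 40 planned services, $3,000 saved per reuse,
# each reused 2.5 times on average in the period.
print(f"${reuse_savings(40, 3_000, 2.5):,.0f}")  # -> $300,000
```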

A major investment bank wanted to build new applications faster, but it couldn't because key reference data, such as counter-party accounts, was duplicated across multiple applications. Beyond slowing development, this proliferation contradicted good banking practice and data governance. The bank shaved 25% off its average development time by creating a shared data services library to house Web services for sharing counter-party master reference data.

Easy Installation and Reliable Operation
Installation wizards, along with management and clustering capabilities, accelerate and automate the installation and runtime operation of data virtualization solutions. As a result, new solutions are deployed faster. The hard metrics include:

  • The number of months earlier a project can be delivered
  • The assignable value associated with each month of accelerated delivery

A leading life sciences R&D organization needed to quickly prototype, develop, and deploy the new information solutions required to support strategic decisions by business executives. It used data virtualization to build and deploy virtual data marts in support of multiple data consumers including Microsoft SharePoint, Business Objects Business Intelligence, TIBCO Spotfire, Microsoft Excel, and various Web portals. This resulted in a 90% reduction in the time required to deploy new information sets.

High-Performance Data Delivery
Data virtualization's query optimization and caching capabilities help eliminate data latencies, speeding the delivery of critical information to users and applications, thereby shortening the time between business events and response.
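
Result caching, one of the capabilities named above, can be sketched in a few lines. The time-to-live value and cache-key scheme here are illustrative assumptions; production platforms combine caching with cost-based query optimization.

```python
import time

_cache = {}  # query text -> (timestamp, result)
TTL_SECONDS = 60.0  # assumed freshness window

def cached_query(sql: str, run_query):
    """Return a fresh cached result, or execute the query and cache it."""
    now = time.monotonic()
    hit = _cache.get(sql)
    if hit is not None and now - hit[0] < TTL_SECONDS:
        return hit[1]  # cache hit: skip the expensive federated query
    result = run_query(sql)
    _cache[sql] = (now, result)
    return result
```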

A North American telecom chip maker targeted faster responses to customer requests. To do this, its sales force management analytics required up-to-the-minute data from the packaged Salesforce.com CRM application as well as other systems. The manufacturer used data virtualization to optimize query performance, ultimately cutting average report runtimes from four minutes to 30 seconds or less.

Technology Savings
Just as storage, server, and application virtualization have demonstrated huge technology savings, data virtualization has proven to provide similar savings by requiring fewer physical data repositories, along with fewer systems to operate and manage them. Many users find that these technology infrastructure savings alone justify the investment, making this a natural place to start a data virtualization deployment.

Fewer Physical Repositories, Lower Hardware, Software, and Facilities Costs
Data virtualization doesn't require replicating data into intermediate physical repositories. Fewer physical data marts and operational data stores mean less supporting hardware and software, which in turn means less rack space, electricity, air conditioning, management software, and other facilities costs. The hard metrics include:

  • The number of servers reduced due to virtual federation rather than physical consolidation
  • The assignable hardware, software, and facilities cost associated with each server

A leading computer maker wanted to reduce the cost of its supply chain and customer management operational BI applications, which included more than 50 intermediate data marts. Each mart required a server, resulting in lifecycle hardware infrastructure costs of $20,000 each. It used data virtualization to provide a virtual supply chain data hub that replaced the physical data marts. This resulted in $1 million in infrastructure cost savings.
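
The arithmetic behind that figure is straightforward (rounding "more than 50" marts to 50):

```python
marts_eliminated = 50                # intermediate data marts replaced by the virtual hub
lifecycle_cost_per_server = 20_000   # hardware infrastructure cost per mart, USD
print(f"${marts_eliminated * lifecycle_cost_per_server:,}")  # -> $1,000,000
```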

Staff Savings
The cost of internal and outsourced staff is typically the largest IT expenditure. Substituting automated tools for labor is one strategy for cutting staffing costs; increasing existing staff productivity by improving software development life cycle (SDLC) processes is another; simplifying the work so that lower-skilled (and therefore less costly) staff can do it is a third. Data virtualization supports these strategies by delivering a user-friendly GUI development environment, along with automated data discovery and data services library capabilities.

Fewer Skills Required
Data virtualization's GUI development environment simplifies and automates detailed design and development work that would otherwise require more technically skilled staff. Enterprises find they spend less on expensive consultants and can redeploy their highly skilled staff to other, more critical work. The hard metrics include:

  • The number of consulting staff hours being reduced
  • The cost per hour

A global money manager wanted to reduce the effort required by 100 financial analysts who build the complex portfolio models used by fund managers. Its solution was to build a virtualization layer surrounding the warehouse to abstract away the complexity of the underlying data. This simplification resulted in a financial analyst productivity increase of 25%, allowing many to be redeployed to develop additional financial analytics useful to the firm.

Greater Collaboration
Discovery capabilities within today's data virtualization platforms let new reporting solutions be validated with end users against live data early in the development process. Analysts and data designers can then hand their models to application developers and operations teams to complete the process. This encourages team collaboration and helps reduce rework. The hard metrics include:

  • The number of hours saved
  • The cost per hour

The same global money management firm cited above wanted to improve the collaboration of its 100 financial analysts. Many of its financial models relied on similar data and data models, but technology hindered these analysts from effectively sharing their work. A common virtualization layer over the financial research data warehouse provided the financial analysts with reusable data views that could be shared for the first time. In addition, IT provided a dedicated DBA and data architect who created the new views as needed. The improved collaboration resulted in higher portfolio returns and a 150% ROI in six months.

Metrics for Hard Times
By understanding the specific contributing value factors of data virtualization, C-level executives and IT managers can more easily calculate both the estimated and actual value of each data virtualization implementation under consideration, be it architecture or project. The resulting hard metrics that clearly contribute to enterprise-wide goals arm budget decision makers with the data they need to make confident decisions in hard economic times.

About the Author

Robert "Bob" Eve is vice president of marketing at Composite Software. Prior to joining Composite, he held executive-level marketing and business development roles at several other enterprise software companies. At Informatica and Mercury Interactive, he helped penetrate new segments in his role as the vice president of Market Development. Bob ran Marketing and Alliances at Kintana (acquired by Mercury Interactive in 2003) where he defined the IT Governance category. As vice president of Alliances at PeopleSoft, Bob was responsible for more than 300 partners and 100 staff members. Bob has an MS in management from MIT and a BS in business administration with honors from University of California, Berkeley. He is a frequent contributor to publications including SYS-CON's SOA World Magazine and Virtualization Journal.
