
Making the Case for Data Virtualization

Hard metrics for hard times

Achieving compelling value from information technology is critical because IT is typically an enterprise or government agency's largest capital expense. Increasing business complexities and technology choices create greater demands for justification when making IT investments.

Cambridge, MA-based analyst firm Forrester Research recently reported that "Business and government's purchases of computer and communication equipment, software, IT consulting, and integration services and IT outsourcing will decline by 3% on a global basis in 2009 when measured in U.S. dollars, then rise by 9% in 2010."

With smaller budgets, IT must validate purchases by correlating tangible business and IT returns that align with corporate strategic objectives. This validation should come early in the acquisition process as well as after the implementation to demonstrate actual value and justify expanded adoption.

Evaluating data virtualization first requires understanding how it specifically delivers value. This understanding can then be used to calculate value and provide the hard metrics required for hard economic times.

Data Virtualization
Data virtualization is used to integrate data from multiple disparate sources - anywhere across the extended enterprise - for consumption by front-end business solutions, including portals, mashups, reports, applications, and search engines (see Figure 1).

Figure 1: Data virtualization at a glance (Source: Composite Software, Inc.)

As a middleware technology, data virtualization (also called virtual data federation) has advanced beyond high-performance query and enterprise information integration (EII). As an IT architecture, it is implemented as a virtualized data layer, an information grid, an information fabric, or a data services layer in service-oriented architecture (SOA) environments. It can also be deployed on a project basis, for business intelligence (BI) and reporting, portals and mashups, and industry-focused single views.
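
To make the virtual-layer idea concrete, the Python sketch below shows one way a federated "single view" could be assembled: two disparate sources are queried on demand and joined at request time, with no intermediate physical data store. The source names, columns, and sample data are hypothetical, and the sketch illustrates the pattern rather than any vendor's actual API.

    # Minimal federation sketch: an operational orders database and a CRM CSV
    # extract are queried on demand and joined at request time, with no
    # intermediate physical data mart. All names and data are hypothetical.
    import csv
    import io
    import sqlite3

    # Source 1: an operational orders system (stand-in: in-memory SQLite).
    orders_db = sqlite3.connect(":memory:")
    orders_db.execute("CREATE TABLE orders (customer_id TEXT, amount REAL)")
    orders_db.executemany(
        "INSERT INTO orders VALUES (?, ?)",
        [("C001", 1200.0), ("C001", 300.0), ("C002", 450.0)],
    )

    # Source 2: a CRM extract delivered as CSV (stand-in: an in-memory string).
    crm_csv = "customer_id,name,segment\nC001,Acme Corp,Enterprise\nC002,Beta LLC,SMB\n"

    def customer_view():
        """Federated single view of the customer: fetch from both sources at
        query time and join the live results into one result set."""
        totals = dict(orders_db.execute(
            "SELECT customer_id, SUM(amount) FROM orders GROUP BY customer_id"))
        for row in csv.DictReader(io.StringIO(crm_csv)):
            yield {**row, "total_orders": totals.get(row["customer_id"], 0.0)}

    for record in customer_view():
        print(record)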

Data Virtualization's Five Value Points
The many ways data virtualization delivers value to business functions and IT operations can be categorized as:

  1. Sales Growth
  2. Risk Reduction
  3. Time Savings
  4. Technology Savings
  5. Staff Savings

Converting these to hard metrics requires an understanding of the relationships between specific data virtualization capabilities and the IT and business value they deliver. Value calculations are made using one or more forms of return-on-investment (ROI) calculators. Examples of the five value points and their metrics along with actual customer case studies are provided below.
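
Although the details differ, every value point below reduces to the same basic arithmetic: count the items within the project's scope (decisions improved, views reused, servers retired, staff hours avoided, months of acceleration) and multiply by an assignable value per item. The Python sketch below illustrates such a calculator; all figures in it are hypothetical placeholders, not benchmarks.

    # Minimal ROI-calculator sketch. Each value point follows the same pattern:
    # (items in scope) x (assignable value per item), summed and compared with
    # the cost of the data virtualization investment. All figures are
    # hypothetical placeholders.

    def value(items_in_scope, value_per_item):
        return items_in_scope * value_per_item

    benefits = {
        "sales_growth":       value(200, 5_000),    # improved decisions x value per decision
        "risk_reduction":     value(50, 10_000),    # risk decisions x value per decision
        "time_savings":       value(6, 40_000),     # months earlier x value per month
        "technology_savings": value(20, 20_000),    # servers avoided x cost per server
        "staff_savings":      value(2_000, 150),    # staff hours avoided x cost per hour
    }

    investment = 750_000  # software, services, and internal effort (hypothetical)
    total_benefit = sum(benefits.values())
    roi = (total_benefit - investment) / investment

    print(f"Total annual benefit: ${total_benefit:,}")  # $2,440,000
    print(f"ROI: {roi:.0%}")                            # 225%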

Sales Growth
As an important indicator of an enterprise's success (or, in the public sector, as an indicator of service growth or mission effectiveness), sales growth results from business strategies such as improved offerings, better customer support, and faster market response. Data virtualization supports these strategies by providing data federation, on-demand data access and delivery, and automated data discovery and modeling.

More Complete Data
Data federation capabilities enable the integration of disparate data on the fly, without physical data consolidation, making more complete data available to revenue-producing and customer-facing staff for better sales-related business decisions. Hard metrics include:

  • The number of decisions within the project's scope
  • The revenue-enhancing value of improving each decision based on the availability of more complete data

An energy provider used data federation to increase oil production from its 10,000 wells. The data included complex surface, subsurface, and business data in high volumes from many disparate sources. The data virtualization solution federated actionable information to automate maintenance and repair decisions made throughout the day, freeing key resources for other value-adding tasks. This increased both staff and repair rig productivity, which were key factors in the 10% increase achieved in well revenue performance and efficiency.

Fresher Data
Data virtualization's on-demand data access and delivery capabilities reach difficult-to-access data and deliver it to consuming applications in near real-time. Fresher data means more timely and accurate decision-making, often yielding sales growth. The hard metrics include:

  • The number of decisions in the project's scope
  • The revenue-enhancing value of improving each decision based on the availability of more timely data

A leading marketing information company used on-demand data access and delivery to grow sales by providing its large consumer goods clients with more timely access to its huge collection of consumer trends and demand information. The data virtualization layer enabled simplified and rapid development of the real-time queries required by the customers' self-service reporting tools. This capability was the key factor behind a 2% increase in revenue.

Quicker Time-to-Solution
Data virtualization's automated data discovery and rapid modeling capabilities reduce the time typically wasted on searching for relationships among data tables. These capabilities automate many of the detailed modeling and validation activities. With quicker time-to-solution, new sales-impacting applications and their associated revenues are available sooner. The hard metrics include:

  • The number of months the project can be delivered earlier
  • The revenue-enhancing value associated with each month where value was accelerated

An investment bank used data discovery and modeling to increase revenues by improving its trade order management, debt/equity market research, and risk management applications. The abstracted data layer in the SOA environment enabled rapid modeling and complex query creation that was shareable across the bank. The resulting 60% reduction in integration design and development time on revenue-enabling applications and portals contributed to a 2% revenue increase at the bank.

Risk Reduction
Risk reduction has become increasingly important as a result of greater complexity and regulation. Becoming more agile in response to risk, improving predictability in light of risk, and ensuring compliance with changing regulations and reporting mandates are a few of the strategies to reduce risk. Data virtualization supports these strategies through its data federation, on-demand data access and delivery, and data discovery and modeling capabilities.

These data virtualization capabilities and IT benefits are similar to those driving sales growth. However, for risk reduction, the business benefit is better risk visibility and faster problem remediation. In both cases, quicker time-to-solution helps get new or improved applications online faster. However, in the case of risk reduction, these might be applications for risk management or compliance reporting, rather than sales or customer management.

More Complete Data
Data federation provides more complete data to risk and compliance management staff, thereby improving data visibility and reducing overall risk. The hard metrics include:

  • The number of risk decisions in the project's scope
  • The risk reduction value of improving each decision based on the availability of more complete data

A global pharmaceutical company used data federation to shorten lengthy R&D cycles and reduce the risk of new product delays. Its Research Scientists' Workbench solution combined disparate structured and semi-structured research data from across the enterprise. Armed with more complete information, researchers were able to resolve problems faster, resulting in 60% fewer new product delays.

Fresher Data and Quicker Time-to-Solution
Data virtualization's on-demand data access and delivery capabilities improve the timeliness of data so risk issues can be remediated faster. Data virtualization's automated discovery and modeling accelerates new risk management and compliance reporting application development, thereby delivering their associated risk-reducing benefits sooner.

Time Savings
New information systems must deliver the data the business needs while reducing the latency between business event and response, so IT is under constant pressure to provide these systems and their associated information more quickly. Strategies for saving development and deployment time, as well as decreasing data latency, are crucial. Data virtualization supports these strategies by providing a data services library; an installation wizard, manager, and clustering; and query optimization capabilities.

Less New Code, Greater Reuse
The data services library holds complete collections of reusable views and data services. By using these existing objects, the need for new coding efforts is lessened, permitting greater reuse across multiple applications. This saves project development time for both new applications and existing ones when changes are needed. The hard metrics include:

  • The number of views or services planned
  • The savings per view or service
  • The percentage reuse factor for a specified time frame
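
One hedged way to combine these three metrics is sketched below; the interpretation of the reuse factor (the share of planned views reused by other projects within the period) and all figures are assumptions for illustration only.

    # Reuse-savings sketch: each time an existing view or data service is
    # reused rather than rebuilt, the build cost of that object is avoided.
    # The figures and the reuse-factor interpretation are assumptions.
    views_planned = 80        # views/services in the project's scope
    savings_per_view = 4_000  # avoided design, build, and test cost per view ($)
    reuse_factor = 0.40       # share of planned views reused by other projects
                              # within the first year

    reuse_savings = views_planned * savings_per_view * reuse_factor
    print(f"Estimated first-year reuse savings: ${reuse_savings:,.0f}")  # $128,000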

A major investment bank wanted to build new applications faster, but it couldn't because key reference data, such as counter-party accounts, was duplicated across multiple applications. Besides slowing development, this proliferation contradicted good banking practices and data governance. The bank shaved 25% off its average development time by creating a shared data services library to house Web services for sharing counter-party master reference data.

Easy Installation and Reliable Operation
Various installation wizards along with manager and clustering capabilities accelerate and automate the installation and runtime of data virtualization solutions. As a result, new solutions are deployed faster. The hard metrics include:

  • The number of months a project can be delivered earlier
  • The assignable value associated with each month the value was accelerated

A leading life sciences R&D organization needed to quickly prototype, develop, and deploy the new information solutions required to support strategic decisions by business executives. It used data virtualization to build and deploy virtual data marts in support of multiple data consumers including Microsoft SharePoint, Business Objects Business Intelligence, TIBCO Spotfire, Microsoft Excel, and various Web portals. This resulted in a 90% reduction in the time required to deploy new information sets.

High-Performance Data Delivery
Data virtualization's query optimization and caching capabilities help eliminate data latencies, speeding the delivery of critical information to users and applications, thereby shortening the time between business events and response.
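
A greatly simplified picture of the caching side of this is sketched below. A real data virtualization engine also rewrites and optimizes the federated query plan; this sketch shows only result caching with a time-to-live, and every name in it is hypothetical.

    # Result-cache sketch with a time-to-live (TTL): repeated requests for the
    # same report query are served from cache; stale entries are refreshed from
    # the underlying sources on demand. Purely illustrative.
    import time

    CACHE_TTL_SECONDS = 60
    _cache = {}  # query text -> (timestamp, rows)

    def run_against_sources(query):
        """Stand-in for dispatching the federated query to the source systems."""
        time.sleep(0.1)  # simulate source latency
        return [("row", query)]

    def cached_query(query):
        now = time.time()
        hit = _cache.get(query)
        if hit and now - hit[0] < CACHE_TTL_SECONDS:
            return hit[1]                      # fresh enough: no source round-trip
        rows = run_against_sources(query)      # fetch live data, then cache it
        _cache[query] = (now, rows)
        return rows

    cached_query("SELECT * FROM sales_by_region")  # first call hits the sources
    cached_query("SELECT * FROM sales_by_region")  # second call served from cache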

A North American telecom chip maker targeted faster responses to customer requests. To do this, its sales force management analytics required up-to-the-minute data from the packaged CRM application as well as other systems. The manufacturer used data virtualization to optimize query performance, ultimately cutting average report runtimes from four minutes to 30 seconds or less.

Technology Savings
Just as storage, server, and application virtualization have demonstrated huge technology savings, data virtualization has proven to deliver similar savings by requiring fewer physical data repositories along with the systems required to operate and manage them. Many users find that these technology infrastructure savings alone justify the investment, and they are frequently a natural place to start a data virtualization deployment.

Fewer Physical Repositories, Lower Hardware, Software, and Facilities Costs
Data virtualization doesn't require replication in intermediate physical data repositories. Fewer physical data marts and operational data stores mean less supporting hardware and software, which in turn means less rack space, electricity, air conditioning, management software, and other facility costs. The hard metrics include:

  • The number of servers reduced due to virtual federation rather than physical consolidation
  • The assignable hardware, software, and facilities cost associated with each server

A leading computer maker wanted to reduce the cost of its supply chain and customer management operational BI applications, which included more than 50 intermediate data marts. Each mart required a server, resulting in lifecycle hardware infrastructure costs of $20,000 each. It used data virtualization to provide a virtual supply chain data hub that replaced the physical data marts. This resulted in $1 million in infrastructure cost savings.
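
The arithmetic behind that figure is straightforward; the sketch below simply restates it using the numbers reported in the case.

    # Technology-savings arithmetic from the computer maker case above:
    # roughly 50 intermediate data marts retired, each carrying about
    # $20,000 in lifecycle hardware infrastructure cost.
    data_marts_retired = 50
    lifecycle_cost_per_mart = 20_000
    infrastructure_savings = data_marts_retired * lifecycle_cost_per_mart
    print(f"${infrastructure_savings:,}")  # $1,000,000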

Staff Savings
The cost of internal and outsourced staff is typically the largest IT expenditure. Substituting automated tools for labor is one strategy for cutting staffing costs. Increasing existing staff productivity by improving SDLC processes is another. Simplifying the work to employ lower-skilled (and therefore less costly) staff is a third. Data virtualization supports these strategies by delivering a user-friendly GUI development environment, along with automated data discovery and data services library capabilities.

Fewer Skills Required
Data virtualization's GUI development environment simplifies and automates detailed design and development work that would otherwise require more technically skilled staff. Enterprises find they spend less on expensive consultants and can redeploy their highly skilled staff to more critical work. The hard metrics include:

  • The number of consulting staff hours being reduced
  • The cost per hour

A global money manager wanted to reduce the effort required by 100 financial analysts who build the complex portfolio models used by fund managers. Its solution was to build a virtualization layer surrounding the warehouse to abstract away the complexity of the underlying data. This simplification resulted in a financial analyst productivity increase of 25%, allowing many to be redeployed to develop additional financial analytics useful to the firm.

Greater Collaboration
Discovery capabilities in today's data virtualization platforms let teams validate new reporting solutions with end users against live data early in the development process. Analysts and data designers can then transfer their models to application developers and operations teams to complete the process. This encourages team collaboration and helps reduce rework. The hard metrics include:

  • The number of hours saved
  • The cost per hour

The same global money management firm cited above wanted to improve the collaboration of its 100 financial analysts. Many of its financial models relied on similar data and data models, but technology hindered these analysts from effectively sharing their work. A common virtualization layer over the financial research data warehouse provided the financial analysts with reusable data views that could be shared for the first time. In addition, IT provided a dedicated DBA and data architect who created the new views as needed. The improved collaboration resulted in higher portfolio returns and a 150% ROI in six months.

Metrics for Hard Times
By understanding the specific contributing value factors of data virtualization, C-level executives and IT managers can more easily calculate both the estimated and actual value of each data virtualization implementation under consideration, be it architecture or project. The resulting hard metrics that clearly contribute to enterprise-wide goals arm budget decision makers with the data they need to make confident decisions in hard economic times.

More Stories By Robert Eve

Robert "Bob" Eve is vice president of marketing at Composite Software. Prior to joining Composite, he held executive-level marketing and business development roles at several other enterprise software companies. At Informatica and Mercury Interactive, he helped penetrate new segments in his role as the vice president of Market Development. Bob ran Marketing and Alliances at Kintana (acquired by Mercury Interactive in 2003) where he defined the IT Governance category. As vice president of Alliances at PeopleSoft, Bob was responsible for more than 300 partners and 100 staff members. Bob has an MS in management from MIT and a BS in business administration with honors from University of California, Berkeley. He is a frequent contributor to publications including SYS-CON's SOA World Magazine and Virtualization Journal.
