By Robert Eve
June 8, 2009 04:45 PM EDT
Achieving compelling value from information technology is critical because IT is typically an enterprise or government agency's largest capital expense. Increasing business complexities and technology choices create greater demands for justification when making IT investments.
Cambridge, MA-based analyst firm Forrester Research recently reported that, "Business and government's purchases of computer and communication equipment, software, IT consulting, and integration services and IT outsourcing will decline by 3% on a global basis in 2009 when measured in U.S. dollars, then rise by 9% in 2010."
With smaller budgets, IT must validate purchases by correlating tangible business and IT returns that align with corporate strategic objectives. This validation should come early in the acquisition process as well as after the implementation to demonstrate actual value and justify expanded adoption.
Evaluating data virtualization first requires understanding how it specifically delivers value. This understanding can then be used to calculate value and provide the hard metrics required for hard economic times.
Data virtualization is used to integrate data from multiple disparate sources - anywhere across the extended enterprise - for consumption by front-end business solutions, including portals, mashups, reports, applications, and search engines (see Figure 1).
Figure 1: Data virtualization at a glance (Source: Composite Software, Inc.)
As middleware technology, data virtualization (also known as virtual data federation) has advanced beyond high-performance query and enterprise information integration (EII). As IT architecture, it is implemented as a virtualized data layer, an information grid, an information fabric, or a data services layer in service-oriented architecture (SOA) environments. It can also be deployed on a project basis, for business intelligence (BI) and reporting, portals and mashups, and industry-focused single views.
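To make the flow in Figure 1 concrete, the sketch below shows the core idea in miniature: a "virtual view" that joins two disparate sources at query time, with no consolidated physical copy. It is an illustration only; the sources, names, and data are hypothetical stand-ins, not any vendor's API.

```python
import sqlite3

# Hypothetical stand-ins for two disparate sources: an operational
# database and a flat-file extract. In a real deployment these would
# be remote systems reached through source-specific adapters.
orders_db = sqlite3.connect(":memory:")
orders_db.execute("CREATE TABLE orders (customer_id INTEGER, amount REAL)")
orders_db.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [(1, 250.0), (2, 75.5), (1, 120.0)],
)

# Second source: customer master data, e.g., from a CRM extract.
customers = {1: "Acme Corp", 2: "Globex Inc"}

def customer_order_view():
    """A 'virtual view': joins both sources on demand at query time,
    with no intermediate physical repository."""
    rows = orders_db.execute(
        "SELECT customer_id, SUM(amount) FROM orders GROUP BY customer_id"
    )
    return [{"customer": customers[cid], "total": total} for cid, total in rows]

print(customer_order_view())
# e.g., [{'customer': 'Acme Corp', 'total': 370.0},
#        {'customer': 'Globex Inc', 'total': 75.5}]
```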
Data Virtualization's Five Value Points
The many ways data virtualization delivers value to business functions and IT operations can be categorized as:
- Sales Growth
- Risk Reduction
- Time Savings
- Technology Savings
- Staff Savings
Converting these to hard metrics requires an understanding of the relationships between specific data virtualization capabilities and the IT and business value they deliver. Value calculations are made using one or more forms of return-on-investment (ROI) calculators. Examples of the five value points and their metrics along with actual customer case studies are provided below.
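Most of the metric pairs that follow reduce to the same arithmetic: a count of affected units multiplied by an assignable value per unit, sometimes scaled by a factor such as reuse. A minimal sketch of such an ROI calculation is below; the function is illustrative, the first example uses figures from the computer maker case study cited later, and the second example's figures are assumptions.

```python
def value_point(units, value_per_unit, factor=1.0):
    """Generic hard-metric calculation: the number of affected units
    (decisions, months, views, servers, hours) times the assignable
    value of each, optionally scaled (e.g., by a reuse factor)."""
    return units * value_per_unit * factor

# Technology savings, using the computer maker case study below:
# roughly 50 data marts retired at ~$20,000 lifecycle cost per server.
print(value_point(units=50, value_per_unit=20_000))  # 1000000, i.e., ~$1M

# Hypothetical time savings: 40 planned views at an assumed $5,000
# saving each, scaled by an assumed 1.5x reuse factor over a year.
print(value_point(units=40, value_per_unit=5_000, factor=1.5))  # 300000
```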
Sales Growth
As an important indicator of an enterprise's success (or, in the public sector, of service growth or mission effectiveness), sales growth results from business strategies such as improved offerings, better customer support, and faster market response. Data virtualization supports these strategies by providing data federation, on-demand data access and delivery, and automated data discovery and modeling.
More Complete Data
Data federation capabilities enable the integration of disparate data on-the-fly without physical data consolidation, making more complete data available to revenue-producing and customer-facing staff for better sales-related business decisions. Hard metrics include:
- The number of decisions within the project's scope
- The revenue-enhancing value of improving each decision based on the availability of more complete data
An energy provider used data federation to increase oil production from its 10,000 wells. The data included high volumes of complex surface, subsurface, and business data from many disparate sources. The data virtualization solution federated this data into actionable information that automated the maintenance and repair decisions made throughout the day, freeing key resources for other value-adding tasks. The resulting gains in staff and repair-rig productivity were key factors in the 10% increase achieved in well revenue performance and efficiency.
Fresher Data
Data virtualization's on-demand data access and delivery capabilities reach difficult-to-access data and deliver it to consuming applications in near real-time. Fresher data means more timely and accurate decision-making, often yielding sales growth. The hard metrics include:
- The number of decisions in the project's scope
- The revenue-enhancing value of improving each decision based on the availability of more timely data
A leading marketing information company used on-demand data access and delivery to grow sales by providing its large consumer goods clients with more timely access to its huge collection of consumer trends and demand information. The data virtualization layer enabled simplified and rapid development of the real-time queries required by the customers' self-service reporting tools. This capability was the key factor behind a 2% increase in revenue.
Quicker Time-to-Solution
Data virtualization's automated data discovery and rapid modeling capabilities reduce the time typically wasted searching for relationships among data tables, automating many of the detailed modeling and validation activities. With quicker time-to-solution, new sales-impacting applications and their associated revenues are available sooner. The hard metrics include:
- The number of months the project can be delivered earlier
- The revenue-enhancing value associated with each month of acceleration
An investment bank used data discovery and modeling to increase revenues by improving its trade order management, debt/equity market research, and risk management applications. The abstracted data layer in the SOA environment enabled rapid modeling and complex query creation that was shareable across the bank. The resulting 60% reduction in integration design and development time on revenue-enabling applications and portals contributed to a 2% revenue increase at the bank.
Risk Reduction
Risk reduction has become increasingly important as a result of greater complexity and regulation. Becoming more agile in response to risk, improving predictability in light of risk, and ensuring compliance with changing regulations and reporting mandates are a few of the strategies to reduce risk. Data virtualization supports these strategies through its data federation, on-demand data access and delivery, and data discovery and modeling capabilities.
These data virtualization capabilities and IT benefits are similar to those driving sales growth, but for risk reduction the business benefit is better risk visibility and faster problem remediation. In both cases, quicker time-to-solution helps get new or improved applications online faster; for risk reduction, these are typically applications for risk management or compliance reporting rather than for sales or customer management.
More Complete Data
Data federation provides more complete data to risk and compliance management staff, thereby improving data visibility and reducing overall risk. The hard metrics include:
- The number of risk decisions in the project's scope
- The risk reduction value of improving each decision based on the availability of more complete data
A global pharmaceutical company used data federation to shorten lengthy R&D cycles and reduce the risk of new product delays. Its Research Scientists' Workbench solution combined disparate structured and semi-structured research data from across the enterprise. Armed with more complete information, researchers were able to resolve problems faster, resulting in 60% fewer new product delays.
Fresher Data and Quicker Time-to-Solution
Data virtualization's on-demand data access and delivery capabilities improve the timeliness of data so risk issues can be remediated faster. Data virtualization's automated discovery and modeling accelerates new risk management and compliance reporting application development, thereby delivering their associated risk-reducing benefits sooner.
Time Savings
New information systems must deliver the data needed while reducing the latency between business event and response, so IT is under constant pressure to provide these new systems and their associated information more quickly. Strategies for saving development and deployment time, as well as for decreasing data latency, are crucial. Data virtualization supports these strategies by providing a data services library; installation wizards, manager, and clustering capabilities; and query optimization.
Less New Code, Greater Reuse
The data services library holds complete collections of reusable views and data services. Using these existing objects lessens the need for new coding and permits greater reuse across multiple applications, saving project development time both for new applications and for existing ones when changes are needed (see the sketch following the metrics below). The hard metrics include:
- The number of views or services planned
- The savings per view or service
- The percentage reuse factor for a specified time frame
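In essence, such a library is a shared catalog of named view and service definitions that projects look up instead of re-coding. A minimal sketch follows, assuming hypothetical view names and queries rather than any particular product's API:

```python
# Minimal sketch of a data services library: a shared catalog of named
# view definitions that projects reuse instead of re-coding. The view
# name and query below are hypothetical.
class DataServicesLibrary:
    def __init__(self):
        self._views = {}

    def publish(self, name, query):
        """Register a reusable view or service definition once."""
        self._views[name] = query

    def get(self, name):
        """Later projects retrieve the shared definition by name."""
        return self._views[name]

library = DataServicesLibrary()
library.publish(
    "counterparty_master",
    "SELECT id, legal_name, rating FROM crm.counterparties",
)

# A second application reuses the same definition instead of
# duplicating the reference data or the integration logic.
print(library.get("counterparty_master"))
```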
A major investment bank wanted to build new applications faster but couldn't, because key reference data, such as counter-party accounts, was duplicated across multiple applications. Besides slowing development, this proliferation contradicted good banking practice and data governance. The bank shaved 25% off its average development time by creating a shared data services library to house Web services for sharing counter-party master reference data.
Easy Installation and Reliable Operation
Installation wizards, along with manager and clustering capabilities, accelerate and automate the installation and runtime operation of data virtualization solutions. As a result, new solutions are deployed faster. The hard metrics include:
- The number of months a project can be delivered earlier
- The assignable value associated with each month of acceleration
A leading life sciences R&D organization needed to quickly prototype, develop, and deploy the new information solutions required to support strategic decisions by business executives. It used data virtualization to build and deploy virtual data marts in support of multiple data consumers including Microsoft SharePoint, Business Objects Business Intelligence, TIBCO Spotfire, Microsoft Excel, and various Web portals. This resulted in a 90% reduction in the time required to deploy new information sets.
High-Performance Data Delivery
Data virtualization's query optimization and caching capabilities help eliminate data latencies, speeding the delivery of critical information to users and applications and thereby shortening the time between business event and response.
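Product internals vary; as a rough, hypothetical illustration of the caching idea, a time-to-live (TTL) cache over a query function might look like the following (all names and the TTL value are assumptions):

```python
import time

CACHE_TTL_SECONDS = 60  # assumed freshness window
_cache = {}  # query text -> (timestamp, result)

def run_query(sql):
    """Hypothetical stand-in for dispatching a federated query
    to the underlying sources."""
    return f"result of {sql!r}"

def cached_query(sql):
    """Serve repeated queries from cache while fresh; otherwise
    re-execute against the sources and refresh the cache."""
    now = time.time()
    hit = _cache.get(sql)
    if hit and now - hit[0] < CACHE_TTL_SECONDS:
        return hit[1]  # cache hit: no round trip to the sources
    result = run_query(sql)
    _cache[sql] = (now, result)
    return result

print(cached_query("SELECT * FROM pipeline_summary"))  # executes the query
print(cached_query("SELECT * FROM pipeline_summary"))  # served from cache
```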
A North American telecom chip maker targeted faster responses to customer requests. To do this, its sales force management analytics required up-to-the-minute data from the packaged Salesforce.com CRM application as well as other systems. The manufacturer used data virtualization to optimize query performance, ultimately cutting average report runtimes from four minutes to 30 seconds or less.
Technology Savings
Just as storage, server, and application virtualization have demonstrated huge technology savings, data virtualization has proven to deliver similar savings by requiring fewer physical data repositories, along with fewer systems to operate and manage them. Many users find that these technology infrastructure savings alone justify the investment, which makes them a natural place to start a data virtualization deployment.
Fewer Physical Repositories, Lower Hardware, Software, and Facilities Costs
Data virtualization doesn't require replication into intermediate physical data repositories. Fewer physical data marts and operational data stores mean less supporting hardware and software, which in turn means less rack space, electricity, air conditioning, management software, and other facilities costs. The hard metrics include:
- The number of servers reduced due to virtual federation rather than physical consolidation
- The assignable hardware, software, and facilities cost associated with each server
A leading computer maker wanted to reduce the cost of its supply chain and customer management operational BI applications, which included more than 50 intermediate data marts. Each mart required a server, resulting in lifecycle hardware infrastructure costs of $20,000 each. It used data virtualization to provide a virtual supply chain data hub that replaced the physical data marts. This resulted in $1 million in infrastructure cost savings.
Staff Savings
The cost of internal and outsourced staff is typically the largest IT expenditure. Substituting automated tools for labor is one strategy for cutting staffing costs; increasing existing staff productivity by improving software development lifecycle (SDLC) processes is another; and simplifying the work so that lower-skilled (and therefore less costly) staff can do it is a third. Data virtualization supports these strategies by delivering a user-friendly GUI development environment, along with automated data discovery and data services library capabilities.
Fewer Skills Required
Data virtualization's GUI development environment simplifies and automates detailed design and development work that would otherwise require more technically capable staff. Enterprises spend less on expensive consultants and can redeploy their highly skilled staff to more critical work. The hard metrics include:
- The number of consulting staff hours being reduced
- The cost per hour
A global money manager wanted to reduce the effort required by 100 financial analysts who build the complex portfolio models used by fund managers. Its solution was to build a virtualization layer surrounding the warehouse to abstract away the complexity of the underlying data. This simplification resulted in a financial analyst productivity increase of 25%, allowing many to be redeployed to develop additional financial analytics useful to the firm.
Improved Collaboration, Less Rework
Discovery capabilities within today's data virtualization platforms let teams validate new reporting solutions with end users, against live data, early in the development process. Analysts and data designers can then hand models over to application developers and operations teams to complete the work. This encourages team collaboration and helps reduce rework. The hard metrics include:
- The number of hours saved
- The cost per hour
The same global money management firm cited above wanted to improve the collaboration of its 100 financial analysts. Many of its financial models relied on similar data and data models, but technology hindered these analysts from effectively sharing their work. A common virtualization layer over the financial research data warehouse provided the financial analysts with reusable data views that could be shared for the first time. In addition, IT provided a dedicated DBA and data architect who created the new views as needed. The improved collaboration resulted in higher portfolio returns and a 150% ROI in six months.
Metrics for Hard Times
By understanding the specific contributing value factors of data virtualization, C-level executives and IT managers can more easily calculate both the estimated and actual value of each data virtualization implementation under consideration, be it architecture or project. The resulting hard metrics that clearly contribute to enterprise-wide goals arm budget decision makers with the data they need to make confident decisions in hard economic times.