By Robert Eve
January 1, 2012 11:00 AM EST
While the benefits derived from greater business agility are significant, costs are also an important factor to consider. This is especially true in today's extremely competitive business environment and difficult economic times.
This article, the last in a series of three articles on how data virtualization delivers business agility, focuses on resource agility.
In Parts 1 and 2, business decision agility and time-to-solution agility were addressed.
Resource Agility Is a Key Enabler of Business Agility
In the recently published Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility, resource agility was identified as the third key element in an enterprise's business agility strategy, along with business decision agility and time-to-solution agility.
Data virtualization directly enables greater resource agility through superior developer productivity, lower infrastructure costs and better optimization of data integration solutions.
These factors combine to provide significant cost savings that can be applied flexibly to fund additional data integration activities and/or other business and IT projects.
Superior Developer Productivity Saves Personnel Costs
At 41% of the typical enterprise IT budget, personnel staffing expenses, including salaries, benefits and occupancy, represent the largest category of IT spending according to recently published analyst research. This spending is double that of both software and outsourcing, and two-and-a-half times that of hardware.
These staffing costs are not only high in absolute terms; with data integration efforts often representing half the work in a typical IT development project, data integration developer productivity is critically important in relative terms as well.
As described in Part 2 of this series, data virtualization uses a streamlined architecture and development approach. Not only does this improve time-to-solution agility, it also improves developer productivity in several ways.
- First, data virtualization allows rapid, iterative development of views and data services. The development and deployment time savings associated with this development approach directly translate into lower staffing costs.
- Second, the typically SQL-based views used in data virtualization are a well-understood IT paradigm. And the IDEs for building these views share common terminology and techniques with the IDEs for the most popular relational databases. The same can be said for data services and popular SOA IDEs. These factors make data virtualization easy for developers to learn and reduce training costs typically required when adopting new tools.
- Third, graphically oriented IDEs simplify data virtualization solution development with significant built-in code generation and automatic query optimization. This enables less senior and lower cost development staff to build data integration solutions.
- Fourth, the views and services built for one application can easily be reused across other applications. This further increases productivity and reduces staffing resource costs.
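The SQL-view paradigm described above can be sketched in miniature. The example below is a hypothetical illustration using Python's built-in SQLite module: two local tables stand in for separate source systems (names like `crm_customers` and `erp_orders` are invented for illustration), and a reusable view plays the role of the virtual integration layer. A real data virtualization platform would federate queries across live, heterogeneous sources rather than a single local database, but the developer-facing idea, defining a view once and letting any consumer query it like a table, is the same.

```python
import sqlite3

# Hypothetical stand-ins for two source systems; in a real deployment these
# would be separate live databases or services, not tables in one store.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE crm_customers (id INTEGER PRIMARY KEY, name TEXT, region TEXT);
    CREATE TABLE erp_orders    (order_id INTEGER, customer_id INTEGER, amount REAL);

    INSERT INTO crm_customers VALUES (1, 'Acme Corp', 'West'), (2, 'Globex', 'East');
    INSERT INTO erp_orders    VALUES (101, 1, 2500.0), (102, 1, 1200.0), (103, 2, 800.0);

    -- The 'virtual' integration layer: a reusable view, not a copied dataset.
    CREATE VIEW customer_revenue AS
    SELECT c.name, c.region, SUM(o.amount) AS total_revenue
    FROM crm_customers c
    JOIN erp_orders o ON o.customer_id = c.id
    GROUP BY c.id;
""")

# Any consuming application can now query the view as if it were a table.
for row in conn.execute("SELECT name, total_revenue FROM customer_revenue ORDER BY name"):
    print(row)
```

Because the view holds no data of its own, consumers always see the current state of the underlying sources, and the same definition can be reused by every application that needs this combined picture.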
Better Asset Leverage Lowers Infrastructure Costs
Large enterprises typically have hundreds, if not thousands, of data sources. While these data assets can be leveraged to provide business decision agility, these returns come at a cost. Each source needs to be efficiently operated and managed and the data effectively governed. These ongoing infrastructure costs typically dwarf initial hardware and software implementation costs.
Traditional data integration approaches, where data is consolidated in data warehouses or marts, add to the overall number of data sources. This necessitates not only greater up-front capital expenditures, but also increased spending for ongoing operations and management. In addition, every new copy of the data introduces an opportunity for inconsistency and lower data quality.
Protecting against these inevitable issues is a non-value-added activity that further diverts critical resources. Finally, more sources mean more complexity, which in turn means large, ongoing investments in coordination and synchronization activities.
These demands consume valuable resources that can be significantly reduced through the use of data virtualization. Because data virtualization requires fewer physical data repositories than traditional data integration approaches, enterprises that use data virtualization lower their capital expenditures as well as their operating, management and governance costs. In fact, many data virtualization users find these infrastructure savings alone can justify their entire investment in data virtualization technology.
Add Data Virtualization to Optimize Your Data Integration Portfolio
As a component of a broad data integration portfolio, data virtualization joins traditional approaches such as ETL-enabled data consolidation into data warehouses and marts, as well as messaging- and replication-based approaches that move data from one location to another.
Each of these approaches has strengths and limitations when addressing various business information needs, data source and consumer technologies, time-to-solution and resource agility requirements.
For example, a data warehouse approach to integration is often deployed when analyzing historical time-series data across multiple dimensions. Data virtualization is typically adopted to support one or more of the five popular data virtualization usage patterns:
- BI data federation
- Data warehouse extension
- Enterprise data virtualization layer
- Big data integration
- Cloud data integration
Given the many information needs, integration challenges, and business agility objectives organizations must juggle, each approach added to the portfolio increases the organization's data integration flexibility and thus its ability to deliver effective data integration solutions.
With data virtualization in the integration portfolio, the organization can optimally mix and match physical and virtual integration methods based on the distinct requirements of a specific application's information needs, source data characteristics and other critical factors such as time-to-solution, data latency and total cost of ownership.
In addition, data virtualization provides the opportunity to refactor and optimize data models that are distributed across multiple applications and consolidated stores. For example, many enterprises use their BI tool's semantic layer and/or data warehouse schema to manage data definitions and models. Data virtualization provides the option to centralize this key functionality in the data virtualization layer. This can be especially useful when the enterprise has several BI tools and/or multiple warehouses and marts, each with its own schemas and governance.
Data virtualization's streamlined architecture and development approach significantly improves developer productivity. Further, data virtualization requires fewer physical data repositories than traditional data integration approaches. This means that data virtualization users lower their capital expenditures as well as their operating, management and governance costs. Finally, adding data virtualization to the integration portfolio enables the optimization of physical and virtual integration methods.
These factors combine to provide significant cost savings that can be applied flexibly to fund additional data integration activities and/or other business and IT projects in the pursuit of business agility.
• • •
Editor's Note: Robert Eve is the co-author, along with Judith R. Davis, of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility, the first book published on the topic of data virtualization. This series of three articles on How Data Virtualization Delivers Business Agility includes excerpts from the book.