How Data Virtualization Improves Business Agility – Part 2

Accelerate value with a streamlined, iterative approach that evolves easily

Business Agility Requires Multiple Approaches
Agile businesses achieve business agility through a combination of business decision agility, time-to-solution agility and resource agility.

This article addresses how data virtualization delivers time-to-solution agility. Part 1 addressed business decision agility and Part 3 will address resource agility.

Time-To-Solution Agility = Business Value
When responding to new information needs, rapid time-to-solution is critically important and often results in significant bottom-line benefits.

Proven time and again across multiple industries, substantial time-to-solution improvements can be seen in the ten case studies described in the recently published Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility.

Consider This Example: If the business wants to enter a new market, it must first financially justify the investment, including any new IT requirements. Thus, only the highest ROI projects are approved and funded. Once the effort is approved, accelerating delivery of the IT solution also accelerates realization of the business benefits and ROI.

Therefore, if incremental revenues from the new market are $2 million per month, then the business will gain an additional $2 million for every month IT can save in time needed to deliver the solution.

Streamlined Approach to Data Integration
Data virtualization is significantly more agile and responsive than traditional data consolidation and ETL-based integration approaches because it uses a highly streamlined architecture and development process to build and deploy data integration solutions.

This approach greatly reduces complexity and minimizes or eliminates the need for data replication and movement. As numerous data virtualization case studies demonstrate, this elegance of design and architecture makes it far easier and faster to develop and deploy data integration solutions using a data virtualization platform. The ultimate result is faster realization of business benefits.

To better understand the difference, let's contrast these methods. In both the traditional data warehouse/ETL approach and data virtualization, understanding the information requirements and reporting schema is the common first step.

Traditional Data Integration Has Many Moving Parts
Using the traditional approach, IT then models and implements the data warehouse schema. ETL development follows, creating the links between the sources and the warehouse. Finally, the ETL scripts are run to populate the warehouse. The metadata, data models/schemas and development tools used within each activity are unique to that activity.

This diverse environment of different metadata, data models/schemas and development tools is not only complex but also results in the need to coordinate and synchronize efforts and objects across them.

Experienced BI and data integration practitioners will readily acknowledge the long development times that result from this complexity, as Forrester Research noted in its 2011 report Data Virtualization Reaches Critical Mass:

"Extract, transform, and load (ETL) approaches require one or more copies of data staged along the physical integration process flow. Creating, storing, and manipulating these copies can be complex and error prone."

Data Virtualization Has Fewer Moving Parts
Data virtualization uses a more streamlined architecture that simplifies development. Once the information requirements and reporting schema are understood, the next step is to develop the objects (views and data services) used to both model and query the required data.

These virtual equivalents of the warehouse schema and ETL routines and scripts are created within a single view or data service object using a unified data virtualization development environment. This approach leverages the same metadata, data models/schemas and tools.

Not only is it easier to build the data integration layer using data virtualization, but there are also fewer "moving parts," which reduces the need for coordination and synchronization activities. With data virtualization, there is no need to physically migrate data from the sources to a warehouse. The only data that is moved is the data delivered directly from the source to the consumer on-demand. These result sets persist in the data virtualization server's memory for only a short interval.

Avoiding data warehouse loads, reloads and updates further simplifies and streamlines solution deployment and thereby improves time-to-solution agility.
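The contrast with the warehouse-load approach can be made concrete. The sketch below uses Python's sqlite3 module, with two in-memory databases standing in for independent source systems (the table and column names are invented for illustration, and SQLite is only a stand-in for a real data virtualization server). A "virtual view" joins the sources on demand, so the only data movement is the result set returned to the consumer:

```python
import sqlite3

# Two independent "source systems", each its own database connection
# (stand-ins for, say, a CRM database and an order-management system).
crm = sqlite3.connect(":memory:")
crm.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
crm.executemany("INSERT INTO customers VALUES (?, ?)",
                [(1, "Acme"), (2, "Globex")])

orders = sqlite3.connect(":memory:")
orders.execute("CREATE TABLE sales (customer_id INTEGER, amount REAL)")
orders.executemany("INSERT INTO sales VALUES (?, ?)",
                   [(1, 100.0), (1, 250.0), (2, 75.0)])

def customer_revenue():
    """A 'virtual view': joins the two sources on demand.

    Nothing is staged or replicated into a warehouse; the result set
    is assembled in memory only for the duration of this request."""
    names = dict(crm.execute("SELECT id, name FROM customers"))
    totals = orders.execute(
        "SELECT customer_id, SUM(amount) FROM sales GROUP BY customer_id")
    return sorted((names[cid], total) for cid, total in totals)

print(customer_revenue())  # [('Acme', 350.0), ('Globex', 75.0)]
```

In an actual data virtualization platform the view would be defined declaratively and optimized by the server; the point of the sketch is simply that there is no warehouse schema to build and no load step to run.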

Iterative Development Process Is Better for Business Users
Another way data virtualization improves time-to-solution agility is through support for a fast, iterative development approach. Here, business users and IT collaborate to quickly define the initial solution requirements followed by an iterative "develop, get feedback and refine" process until the solution meets the user need.

Most users prefer this type of development process. Because building views of existing data is simple and fast, IT can provide business users with prospective versions of new data sets in just a few hours, rather than making them wait months while IT develops detailed solution requirements up front. Business users can then react to these data sets and refine their requirements based on tangible insights, and IT can adjust the views and present the refined data sets in turn.

This iterative development approach enables the business and IT to home in on and deliver the needed information much faster than traditional integration methods.
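As a toy illustration of the "develop, get feedback and refine" loop (again using SQLite as a stand-in, with invented table names), a user's request for a different breakdown becomes a small view change rather than a warehouse schema migration and ETL rework:

```python
import sqlite3

# One source table standing in for an existing system of record.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE sales (region TEXT, amount REAL, year INTEGER)")
src.executemany("INSERT INTO sales VALUES (?, ?, ?)",
                [("East", 100.0, 2010), ("West", 200.0, 2010),
                 ("East", 150.0, 2011)])

# Iteration 1: the initial view IT shows the business user.
src.execute("""CREATE VIEW revenue_v1 AS
               SELECT region, SUM(amount) AS revenue
               FROM sales GROUP BY region""")

# Feedback: "we need this broken out by year, not just by region."
# Iteration 2 is a view change only -- no schema migration, no reload.
src.execute("""CREATE VIEW revenue_v2 AS
               SELECT region, year, SUM(amount) AS revenue
               FROM sales GROUP BY region, year""")

print(src.execute(
    "SELECT * FROM revenue_v2 ORDER BY region, year").fetchall())
# [('East', 2010, 100.0), ('East', 2011, 150.0), ('West', 2010, 200.0)]
```

Each refinement cycle touches only the view definition, which is why turnaround can be hours rather than weeks.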

Even in cases where a data warehouse solution is mandated by specific analytic needs, data virtualization can be used to support rapid prototyping of the solution. The initial solution is built using data virtualization's iterative development approach, with migration to the data warehouse approach once the business is fully satisfied with the information delivered.

In contrast, developing a new information solution using traditional data integration architecture is inherently more complex. Typically, business users must fully and accurately specify their information requirements prior to any development, with little change tolerated. Not only does the development process take longer, but there is a real risk that the resulting solution will not be what the users actually need and want.

Data virtualization offers significant value, and the opportunity to reduce risk and cost, by enabling IT to quickly deliver iterative results that help users understand their real information needs and arrive at a solution that meets them.

Ease of Data Virtualization Change Keeps Pace with Business Change
The third way data virtualization improves time-to-solution agility is ease of change. Information needs evolve, and so do the associated source systems and consuming applications. Data virtualization enables a more loosely coupled architecture between sources, consumers and the data virtualization objects and middleware that integrate them.

This level of independence makes it significantly easier to extend and adapt existing data virtualization solutions as business requirements or associated source and consumer system implementations change. In fact, changing an existing view, adding a new source or migrating from one source to another is often completed in hours or days, versus weeks or months in the traditional approach.
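The loose coupling described above can be sketched with the same SQLite stand-in (table and column names are invented for illustration): consumers bind to a stable view name, so migrating from one source to another is a view redefinition rather than a change to every consuming application:

```python
import sqlite3

db = sqlite3.connect(":memory:")
# A legacy source and its replacement, with different physical layouts.
db.execute("CREATE TABLE legacy_crm (cust TEXT, rev REAL)")
db.execute("INSERT INTO legacy_crm VALUES ('Acme', 350.0)")
db.execute("CREATE TABLE new_crm (customer_name TEXT, total_revenue REAL)")
db.execute("INSERT INTO new_crm VALUES ('Acme', 350.0)")

# Consumers query the view, never the physical tables.
db.execute("""CREATE VIEW customer_revenue AS
              SELECT cust AS name, rev AS revenue FROM legacy_crm""")

# Migration: repoint the view at the new source. Consumers are untouched;
# the column aliases absorb the schema differences.
db.execute("DROP VIEW customer_revenue")
db.execute("""CREATE VIEW customer_revenue AS
              SELECT customer_name AS name, total_revenue AS revenue
              FROM new_crm""")

print(db.execute(
    "SELECT name, revenue FROM customer_revenue").fetchall())
# [('Acme', 350.0)]
```

Because every consumer sees the same stable interface, source migrations become middleware changes measured in hours or days rather than application rewrites measured in weeks or months.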

Conclusion
Data virtualization reduces complexity, data replication and data movement. Business users and IT collaborate to quickly define the initial solution requirements, followed by an iterative "develop, get feedback and refine" delivery process. Further, its independent layers make it significantly easier to extend and adapt existing data virtualization solutions as business requirements or associated source and consumer system implementations change.

These time-to-solution accelerators, as numerous data virtualization case studies demonstrate, make it far easier and faster to develop and deploy data integration solutions using a data virtualization platform than other approaches. The result is faster realization of business benefits.

Editor's Note: Robert Eve is the co-author, along with Judith R. Davis, of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility, the first book published on the topic of data virtualization. This series of three articles on How Data Virtualization Delivers Business Agility includes excerpts from the book.

More Stories By Robert Eve

Robert Eve is the EVP of Marketing at Composite Software, the data virtualization gold standard and co-author of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility. Bob's experience includes executive level roles at leading enterprise software companies such as Mercury Interactive, PeopleSoft, and Oracle. Bob holds a Master of Science from the Massachusetts Institute of Technology and a Bachelor of Science from the University of California at Berkeley.
