It wouldn’t take a genius to notice the evolution of modern technology. In just the past ten years, we’ve watched the flip phone transform into the smartphone and the automobile inch towards autonomy. Within our own space, we’ve noticed a similar shift: data virtualization is resurgent and growing in importance within the larger data integration market. To better understand the nature of this shift, we turn to Gartner:
“As data integration architectures continue to shift from physical bulk/batch movement to virtualized and real-time granular data delivery, data and analytics leaders must intertwine integration styles to match all requirements of business changes.”1
It appears that a similar evolution is underway, as a combination of data integration styles is now required to meet the rigorous demands placed on the 21st-century corporation. Specifically, the right combination appears to be ETL and data virtualization, a pairing that yields a hybrid approach capable of handling bulk movements while opening the door to real-time data access.
Fundamentally, both business and technology are changing rapidly. Where businesses used to rely on on-premises data, recent years have brought a shift to cloud-based data. That shift demands a multifaceted approach to data delivery, hence the hybrid approach of ETL plus data virtualization. These advances represent a technological shift that, as Gartner notes, businesses are lining up to follow.
“A hybrid approach to data integration is growing in popularity because, among various advantages, it provides the ability to integrate data from, and execute data in, both cloud and on-premises environments, when necessary.”1
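To make the distinction between the two styles concrete, here is a minimal, purely illustrative Python sketch. It is not Denodo’s implementation; the sources are hypothetical, with in-memory SQLite databases standing in for an on-premises warehouse and a cloud application database. A batch ETL step physically copies historical data, while a virtualized query federates the same sources in real time without moving anything.

```python
# Minimal sketch of the hybrid pattern: bulk ETL plus virtualized access.
# All table names and data below are hypothetical, for illustration only.
import sqlite3

# --- hypothetical sources ---------------------------------------------------
on_prem = sqlite3.connect(":memory:")   # stands in for an on-premises warehouse
cloud = sqlite3.connect(":memory:")     # stands in for a cloud application DB

on_prem.execute("CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL)")
on_prem.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                    [(1, 101, 250.0), (2, 102, 75.5)])

cloud.execute("CREATE TABLE customers (customer_id INTEGER, name TEXT)")
cloud.executemany("INSERT INTO customers VALUES (?, ?)",
                  [(101, "Acme Corp"), (102, "Globex")])

# --- style 1: bulk/batch ETL -------------------------------------------------
def batch_etl(target: sqlite3.Connection) -> None:
    """Physically move (extract and load) historical orders into a reporting store."""
    rows = on_prem.execute("SELECT order_id, customer_id, amount FROM orders").fetchall()
    target.execute("CREATE TABLE IF NOT EXISTS orders_history "
                   "(order_id INTEGER, customer_id INTEGER, amount REAL)")
    target.executemany("INSERT INTO orders_history VALUES (?, ?, ?)", rows)

# --- style 2: virtualized, real-time access ----------------------------------
def virtual_customer_orders() -> list[tuple]:
    """Join both sources at query time; nothing is copied or persisted."""
    customers = dict(cloud.execute("SELECT customer_id, name FROM customers"))
    return [(order_id, customers.get(customer_id, "unknown"), amount)
            for order_id, customer_id, amount
            in on_prem.execute("SELECT order_id, customer_id, amount FROM orders")]

if __name__ == "__main__":
    reporting = sqlite3.connect(":memory:")
    batch_etl(reporting)              # scheduled bulk movement
    print(virtual_customer_orders())  # real-time, federated view
```

The point of the sketch is simply that both styles can coexist against the same sources: bulk movement where latency is acceptable, and federation where freshness matters.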
As a data virtualization provider, we are, of course, seeing dramatic increases in customer momentum. With double-digit growth rates and the rise of modern data architectures such as the logical data warehouse, it’s clear that the market is shifting toward modernization. Time after time, we find ourselves enabling this hybrid architecture, as businesses search for a versatile data delivery platform able to weather the volatility of business cycles. Along the way, these businesses are quick to realize a number of benefits, including cost and time savings, resource optimization, and a unified view of data, among many others. Such benefits bring us to Gartner’s overarching recommendation:
“Identify and implement a portfolio-based approach to your integration strategy that extends beyond consolidating data via extraction, transformation, and loading (ETL) to include real-time flows, event recognition, and data virtualization.”1
All in all, it’s the flexibility and capability of this approach that attract businesses. Gartner’s Data Virtualization Market Guide will shed more light on how to navigate these waters, but until then, I’d suggest jumping on board this technological revolution.
1 Gartner, The State and Future of Data Integration: Optimizing Your Portfolio of Tools to Harness Market Shifts, May 2016
Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.