As organizations continue to pursue increasingly time-sensitive use cases, including customer 360° views, supply-chain logistics, and healthcare monitoring, they need their supporting data infrastructures to be increasingly flexible, adaptable, and scalable. Howard Dresner refers to such infrastructures as “active data architecture.” Dresner not only coined the term “business intelligence (BI)” but also led the Gartner BI practice during its foundational years, and active data architecture is now a key area of focus for him as founder and principal of Dresner Advisory Services.
In the 2024 Active Data Architecture™ report released by Dresner Advisory Services, “active data architecture” is defined as an architecture that supports “a platform-independent layer that sits between physical data stores and points of data consumption. It is comprised of various data management capabilities including virtualized and distributed data access, data governance, and security. At its core, Active Data Architecture is an abstraction layer translating business and physical structures. It is an architecture dynamically optimized for performance, scalability, and cost management.”
Without an active data architecture, organizations cannot easily:
- Support many of today’s demanding use cases, including the ones mentioned above
- Adapt to sudden disruptions such as market, technological, or economic changes
- Scale without impacting such critical processes as product/service delivery, customer service, and go-to-market initiatives
The 2024 Active Data Architecture™ report draws on survey data from over 6,000 organizations worldwide to deliver insights into how companies are supporting active data architecture in the current market. Let’s take a closer look at this report.
Data Virtualization: The Needed Abstraction Layer
The report makes it clear that active data architecture comprises multiple technologies, so a key survey question asks which technology or technologies organizations are leveraging in support of active data architecture, today and in the future. According to the report, the most popular technology used today for active data architecture is data virtualization, which abstracts data consumers from underlying data sources, enabling an enterprise-wide data-access layer. The technologies that most organizations plan to adopt in the next 12 months are a semantic layer, a data catalog, and, once again, data virtualization. It is important to note that semantic layers and the most effective, full-featured data catalogs can both be enabled by data virtualization.
Data virtualization also features prominently in responses to a question about which active data architecture technologies are most associated with BI success. Data virtualization was selected by almost 80 percent of organizations that have been extremely successful in their BI initiatives, a much higher percentage than the almost 60 percent of the same group who selected metrics layers, the next-most-selected technology.
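To make the abstraction concrete, below is a minimal sketch of how a data consumer might query a virtualized data-access layer, assuming a hypothetical ODBC endpoint exposed by that layer; the DSN, credentials, view name (customer_360), and column names are invented for illustration and do not come from the report or any specific product.

```python
import pyodbc  # generic ODBC client; the virtualization layer is assumed to expose an ODBC endpoint

# Hypothetical DSN pointing at the virtualization layer, not at any physical database.
conn = pyodbc.connect("DSN=virtual_data_layer;UID=analyst;PWD=***")

# The consumer queries one logical view; the virtualization layer resolves it
# against the underlying sources (for example, CRM and billing systems) at query time.
cursor = conn.cursor()
cursor.execute(
    """
    SELECT customer_id, lifetime_value, last_interaction
    FROM customer_360          -- virtual view; no physical table carries this name
    WHERE region = ?
    """,
    "EMEA",
)

for row in cursor.fetchall():
    print(row.customer_id, row.lifetime_value, row.last_interaction)

conn.close()
```

The consuming code references only the logical view; which sources sit behind it, and how they are combined, can change without touching the consuming application.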
APIs: The Modern Data Delivery Medium
The authors also make it clear that active data architecture needs to support the delivery of data to a wide range of users, in the formats they need and at the speed they require. In the words of the report, “Because the vision for active data architecture is that it is a pervasive and dynamic layer supporting a range of data product needs, organizations must access the core capabilities of the architecture in a variety of ways: programmatically from various systems and tools using APIs, orchestrated in any sequence or combination, and designed into business processes by using packaged workflow or business process design and execution tools.” Increasingly, APIs have opened up myriad opportunities for revenue expansion in consumer and business applications, such as banking innovations in the U.K. under the Open Banking banner.
Perhaps not surprisingly, most organizations surveyed (88 percent) considered the “programmatic delivery of data over APIs” to be of primary importance, followed by the “arbitrary execution of sequences and combinations of capabilities” (77 percent), and “execution via workflow tools” (75 percent).
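As a concrete illustration of the “programmatic delivery of data over APIs” that respondents ranked highest, here is a minimal sketch of an application consuming a data product over REST, assuming a hypothetical endpoint published by the data-delivery layer; the URL, path, and field names are invented for the example and are not from the report.

```python
import requests

# Hypothetical REST endpoint published by the data-delivery layer.
BASE_URL = "https://data.example.com/api/v1"


def fetch_open_invoices(customer_id: str) -> list[dict]:
    """Retrieve a customer's open invoices as JSON over a governed API."""
    response = requests.get(
        f"{BASE_URL}/customers/{customer_id}/invoices",
        params={"status": "open"},
        headers={"Authorization": "Bearer <token>"},  # access is authenticated and governed
        timeout=10,
    )
    response.raise_for_status()  # surface HTTP errors instead of silently continuing
    return response.json()["items"]


if __name__ == "__main__":
    for invoice in fetch_open_invoices("C-1042"):
        print(invoice["invoice_id"], invoice["amount_due"])
```

Because the call goes through the same abstraction layer described above, governance and security policies can be applied consistently regardless of which tool or application makes the request.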
The Denodo Factor
At Denodo, we’ve been practicing active data architecture for over 25 years, since we first introduced, and then strengthened, the data virtualization foundation at the core of our platform, though we used other terms to describe it. But “active data architecture” fits us perfectly, as we’ve evolved since our early days to support semantic layers, full-featured data catalogs, and the secure, governed delivery of data over APIs.
Additionally, most organizations surveyed in the report indicated that they would seek active data architecture from a data integration vendor rather than from a vendor of BI/analytics, data mesh or data fabric, cloud infrastructure, or databases. This is also fitting, as Denodo started out as a data integration vendor before branching out into the larger sphere of logical data management.
Active Data Architecture: Read the Report
Please read the executive summary of the report or the full (93-page) 2024 Active Data Architecture™ Report from Dresner Advisory Services (both are complimentary).