It’s an interesting question. After doing extensive research on AI readiness, I found myself trying to understand what the experts had to say on the matter. One surprising statistic from the Rand Corporation is that 80% of artificial intelligence (AI) projects fail. Similarly, a study from Deloitte shows that close to 70% of generative AI (GenAI) projects stall in proof-of-concept, lacking the ability to scale. According to a Wharton study, over 50% of executives believe that they are not ready for AI. This suggests that there are few true experts when it comes to real-world AI implementations, especially at scale. However, reflecting on my own experience leading a large-scale AI initiative, I realized that my insights might help others assess their own AI readiness.
Are You Ever Truly Ready?
The concept of readiness can be tricky. It reminds me of questions we face in life: when to change careers, have children, or get married. We often think we’ll know when we’re ready, but in reality, readiness is often only clear after we’ve taken the leap. When I had my first child, I thought I was fully prepared. I bought the safest stroller, and I even opted for cloth diapers.
But once life kicked in, I realized that no matter how prepared I thought I was, there were many things I couldn’t have anticipated—like the bulky stroller that barely fit in my car or the impracticality of cloth diapers for my lifestyle. The same is true for AI. Preparation is important, but over-preparing can paralyze you. The key is to stay open to change and embrace the iterative nature of innovation. AI is a journey, not a one-time decision.
A Journey into AI
In 2015, I had the opportunity to be part of an exciting AI journey. This was before MLOps, DataOps, and other related “Ops” were around. That was the era of big data, the inception of the current data and analytics era. This was when companies started looking to get more value and competitive advantage out of data. This was when organizations that invested in AI at scale were pioneers, and the failure rate was probably even higher than the 85% projected as recently as 2022. My colleagues and I were embarking on something exciting, yet we were full of uncertainty, as this was our entrée to AI as a business transformation agent.
At that time, I was working for a global leader in the food flavor industry, and I was given the opportunity to be the technical lead for this initiative. Our leadership wanted to use AI to enhance our product development, so we could stay competitive in the market. We wanted to create an AI system that could work alongside developers, helping them to be both more creative and faster. This vision seemed far out of our reach, and by many accounts, we were not “ready.” But we had a mission and leadership support to make it happen. Many companies that attempted similar AI initiatives were experiencing large losses due to failed projects. Today, unfortunately, the story remains much the same, as many AI and digital transformation projects are struggling.
Despite the challenges we faced, we successfully implemented our AI project within a year of starting development, and we continuously enhanced and adapted it over the years. As I reflect on our journey, I’m reminded of the key drivers of our success:
- Choose a Project that Demonstrates Clear Costs and Value
When tackling your first AI project, be prepared for changes to your business processes, ways of thinking, and the costs that come with adopting new technologies. Building and implementing transformational AI is about transforming aspects of your enterprise. The value can be improved competitiveness, increased customer intimacy, extended global knowledge and expertise, better quality products, optimized processes, etc., all of which require varying methods of measurement. Understanding the value AI can bring, and how long it will take to realize that value, is crucial. You should also carefully weigh the costs and benefits of external expertise against in-house resources.
AI projects must be guided by a clear understanding of ROI. Build mechanisms into the project to measure the impact and ensure that value is realized as expected. Using a logical data fabric enabled by the Denodo Platform (see the fourth driver below, on leveraging abstraction), we were able to gain continuous insight into many measures such as user adoption, adherence to processes, development cycle times, product assessment, material usage, products inspired by AI, sales, and much more.
We were also able to review and measure how well our users were following business process guidelines. Our users understood that the data they produce must be well understood if AI models are to learn from it properly. Our logical data fabric empowered our data scientists and analysts to quickly adapt metrics and measures as the project progressed. We were able to tie these metrics to goals so managers and end users could have insight on-demand.
We were also able to measure the output from our models by running simulations in which the data was immediately accessible for evaluation. This helped us to guide how we could enhance and improve our models over time.
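To make that concrete, here is a rough sketch of the kind of measurement query this enables. It assumes the fabric exposes business-named views over a standard SQL interface; the view names (bv_experiments, bv_product_cycles), columns, and placeholder style are hypothetical examples, not our actual code.

```python
# Sketch: pulling adoption and cycle-time measures from business-named views
# exposed by the semantic layer. Pass in any DB-API 2.0 connection; the "?"
# placeholder style depends on your driver.

from datetime import date


def project_kpis(conn, since: date) -> dict:
    """Return simple adoption and cycle-time measures from semantic-layer views."""
    cur = conn.cursor()

    # How many developers have logged at least one AI-assisted experiment?
    cur.execute(
        "SELECT COUNT(DISTINCT developer_id) FROM bv_experiments "
        "WHERE ai_assisted = TRUE AND created_at >= ?",
        (since,),
    )
    active_ai_users = cur.fetchone()[0]

    # Average development cycle time, in days, for recently started products.
    cur.execute(
        "SELECT AVG(cycle_days) FROM bv_product_cycles WHERE started_at >= ?",
        (since,),
    )
    avg_cycle_days = cur.fetchone()[0]

    return {"active_ai_users": active_ai_users, "avg_cycle_days": avg_cycle_days}
```

The point is less the query itself than that the measures live in the same business language the rest of the project uses, so they can evolve as quickly as the models do.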
These types of projects are iterative in nature, and to gain business value from them, you have to make them part of your business processes. Measuring the many aspects of such a project is therefore crucial for success and continued improvement.
- Leadership Support Is Critical
When leadership is fully behind an initiative, they can clear obstacles, allocate resources, and drive progress. Leadership is also critical in driving focus on the mission and scope and evaluating when either might need adjustment. Leadership helps with communication and driving change management throughout the organization, to all areas touched by the transformation. Their commitment makes a huge difference in moving projects forward and enabling their timely success.
- Foster a Collaborative, Multidisciplinary Team
Our project’s success was largely due to our cohesive, highly collaborative team. We brought together leadership, external AI researchers, business domain experts, change management professionals, and IT specialists. These diverse perspectives and skills were essential. The business experts shared their knowledge of the industry, while IT provided the necessary infrastructure, technology, and data products. AI researchers translated this data into models that met our business goals. Frequent, rapid collaboration enabled us to move quickly while keeping everyone aligned on our objectives. Our communications were in the language of the business, especially when data was involved.
- Leverage Abstraction Through a Logical Data Fabric for Agility
In my experience, there is an overwhelming belief that the biggest challenge with AI lies in model creation. It certainly has its fair share of challenges, as talented AI researchers must translate business processes into mathematical models and interpret them. But for this activity to even begin, someone must translate activities and processes in the enterprise into understandable components that reflect them. The task is then to continuously deliver this information, regardless of the complexity of the enterprise.
Why a logical data fabric?
One of the most crucial technical approaches we employed was leveraging abstraction through a logical data fabric. This approach enabled us to establish a unified semantic layer above our data sources, which automatically translates all data into the language of the business. We provided AI researchers and business users with consistent enterprise data models, even as our enterprise environment evolved. The idea of abstraction is rooted in software engineering, where encapsulation and modularity are key for ensuring that changes in one part of the system don’t disrupt the whole. As the enterprise data architect, I researched and sought out this logical approach because I understood that technology that empowered these principles would help us to keep pace with the demands of the project.
The fast-changing data landscape posed a significant challenge. We had to maintain the integrity of the information we shared while delivering “new” data rapidly for AI models. By creating a logical data fabric, we established an abstraction layer that enabled us to manage and deliver data on demand. This layer insulated us from changes in the underlying systems and business processes, giving us agility while ensuring the consistency and stability required for AI research. Using the Denodo Platform’s logical data fabric as our abstraction layer empowered us to maintain real-time, on-demand data access in the language of the business, despite the dynamic nature of our data sources.
This approach enabled us to meet the demands of an innovative AI project while keeping our existing system functional. Abstraction made it possible to provide consistent, reliable information, which was key to the success of both the AI models and the overall project.
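The underlying idea can be sketched in a few lines of plain Python. This is a conceptual illustration, not the Denodo Platform’s API: consumers ask for business-language entities, and the mapping to whichever physical source currently holds the data can be swapped without touching them. All class, entity, and source names here are invented for the example.

```python
# Conceptual sketch of abstraction in a logical data fabric: consumers see
# business entities; where the data physically lives is hidden behind a
# mapping that can change without breaking consumers.

from dataclasses import dataclass
from typing import Callable, Iterable


@dataclass
class Flavor:                      # business-language entity
    flavor_id: str
    name: str
    cost_per_kg: float


class LogicalLayer:
    """Resolves business entities to whichever source currently holds them."""

    def __init__(self) -> None:
        self._resolvers: dict[str, Callable[[], Iterable[dict]]] = {}

    def register(self, entity: str, resolver: Callable[[], Iterable[dict]]) -> None:
        # Swapping a resolver (e.g., ERP -> data lake) does not affect consumers.
        self._resolvers[entity] = resolver

    def flavors(self) -> list[Flavor]:
        rows = self._resolvers["flavor"]()
        return [Flavor(r["id"], r["name"], float(r["cost_per_kg"])) for r in rows]


# AI researchers and analysts only ever see Flavor objects; the underlying
# system can be re-platformed behind register() without disrupting them.
layer = LogicalLayer()
layer.register("flavor", lambda: [{"id": "F-001", "name": "Vanilla", "cost_per_kg": 42.0}])
print(layer.flavors())
```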
- Business Processes Drive Data Quality
Data quality is critical in AI projects, but it’s important to understand that data quality reflects the processes that generate it. If business processes are inefficient, inconsistent, or filled with workarounds, your data will reflect those issues. In our experience, fixing data quality challenges downstream—such as in data pipelines, extract, transform, and load (ETL) jobs, or other transformation mechanisms—can be costly and time-consuming. Instead, addressing these issues at the source, by improving business processes, is far more efficient, when possible.
This is where leveraging a logical data fabric became invaluable. By implementing this abstraction layer, we were able to give business users direct insight into the data their processes were producing. With this real-time view, business users could see the gaps and inefficiencies in their processes that they hadn’t been aware of. For instance, many users thought their processes were streamlined, but the data revealed inconsistencies and inefficiencies they hadn’t anticipated. We were also able to assess how well people were adhering to established processes so we could determine if we needed to employ appropriate mechanisms.
With this newfound visibility, users were empowered to address these issues at the process level, improving both the quality of the data and the effectiveness of their operations. The logical data fabric enabled business users to interact with the data in a way that made sense to them—reflecting their own language and workflows. As a result, they were able to make informed decisions about where adjustments needed to be made. This ability to address problems early, based on concrete data insights, significantly improved the quality of the data being fed into our AI models, ultimately providing more accurate results.
In short, the logical data fabric didn’t just provide data; it provided actionable insights into business processes, enabling continuous improvement and alignment with the organization’s operations. This capability became a key driver of both data quality and overall project success.
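As a simple illustration of what giving business users insight into their own process data can look like, here is a hypothetical sketch. The record fields and rules are invented for the example; the pattern of reporting gaps back in business terms, at the source, is the point.

```python
# Sketch: simple rule checks over process records, reported in business
# language so process owners can fix issues upstream rather than in pipelines.

def check_trial_records(records: list[dict]) -> list[str]:
    """Return human-readable findings about gaps in development-process data."""
    findings = []
    for rec in records:
        if rec.get("panel_score") is None:
            findings.append(f"Trial {rec['trial_id']}: tasting panel score never recorded")
        if rec.get("completed_at") and rec.get("approved_at") is None:
            findings.append(f"Trial {rec['trial_id']}: completed but approval step skipped")
    return findings


sample = [
    {"trial_id": "T-101", "panel_score": 7.5, "completed_at": "2016-03-01", "approved_at": "2016-03-03"},
    {"trial_id": "T-102", "panel_score": None, "completed_at": "2016-03-02", "approved_at": None},
]
for finding in check_trial_records(sample):
    print(finding)
```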
- Agility is Key
Time is almost always at a premium, whether you use internal or external resources. The ability to be agile enables you to adapt quickly, reducing costs and time to value. Data drives AI projects, and the ability to identify, agree on, integrate, and deliver it quickly, consistently, and simply, regardless of where or how it was stored, was critical for our AI researchers as they iterated through model development. With the abstraction provided by our logical data fabric, we were able to create contracts for data exchange, enabling our teams to work independently while keeping all parts of the project on track.
Early in the project, for example, we needed to incorporate an external data source. Using our logical data fabric, we evaluated several options and selected a service within a week. This approach also enabled us to integrate external information with our internal data efficiently, quickly turning it into valuable data products. Additionally, we could prototype new features for testing in our models before committing them to system-wide production changes. This agility in data management supported faster experimentation and iteration, proving invaluable to the project’s overall success.
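For readers who have not worked with data contracts, a minimal sketch of the idea follows. Our contracts were expressed as views in the logical data fabric, but the essence is an agreed, typed shape that lets producers and consumers work independently; the fields below are purely illustrative.

```python
# Sketch of a lightweight "data contract": an agreed schema for a data product,
# validated where teams exchange data so a producer-side change fails fast.

from dataclasses import dataclass


@dataclass(frozen=True)
class MarketSignal:
    """Hypothetical contract for an external-market data product."""
    region: str
    ingredient: str
    demand_index: float      # agreed to be normalized to 0..1
    as_of: str               # ISO date string


def validate(row: dict) -> MarketSignal:
    """Reject rows that break the agreed shape before they reach the models."""
    signal = MarketSignal(
        region=str(row["region"]),
        ingredient=str(row["ingredient"]),
        demand_index=float(row["demand_index"]),
        as_of=str(row["as_of"]),
    )
    if not 0.0 <= signal.demand_index <= 1.0:
        raise ValueError(f"demand_index out of range: {signal.demand_index}")
    return signal
```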
- Strategically Creating Reusable Data Products
During our project, we built numerous data products that represented entities within the enterprise. These products, created through the logical data fabric, weren’t just usable by the AI project—they were designed with reuse in mind. This strategic approach to data management enabled us to support other projects, such as analytics and integration, by reusing the same data products. By building reusable artifacts, we created efficiencies that paid off in the long run. These efficiencies extended to our ability to enhance, extend, and support these products.
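Roughly sketched, the reuse pattern looks like this: one canonical data product definition, consumed unchanged by both an AI pipeline and an analytics report. The names are illustrative; in practice these were views in the logical layer rather than Python functions.

```python
# Sketch: build a data product once, reuse it across AI and analytics consumers.

from typing import Callable, Iterable


def ingredient_usage_product(fetch_rows: Callable[[], Iterable[dict]]) -> list[dict]:
    """Canonical, business-named data product for ingredient usage."""
    return [
        {"ingredient": r["ingredient"], "kg_used": float(r["kg_used"]), "project": r["project"]}
        for r in fetch_rows()
    ]


def feature_vector(product: list[dict]) -> dict:
    """AI consumer: aggregate usage per ingredient as model features."""
    features: dict[str, float] = {}
    for row in product:
        features[row["ingredient"]] = features.get(row["ingredient"], 0.0) + row["kg_used"]
    return features


def usage_report(product: list[dict]) -> str:
    """Analytics consumer: simple text report over the same product."""
    return "\n".join(
        f"{row['project']}: {row['ingredient']} {row['kg_used']:.1f} kg" for row in product
    )
```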
Long-Term Impact
This AI initiative made a positive impact on our organization—and it is still evolving to this day. The AI “colleague” learned to provide new, creative ideas that helped to reduce development cycles as well as product costs. Being one of the early adopters of AI also deepened customer intimacy and attracted new developers.
Are You Ready for AI?
Is anyone ever truly ready for it? Perhaps not entirely. But the key is to begin. Be prepared for the challenges, embrace the innovations, and focus on creating value. Leveraging abstraction through a logical data fabric can provide that agility, as well as the consistency and resilience needed to navigate the evolving demands of an AI project. Part of the journey is assessing your data management approach and refining it when necessary. Success in AI isn’t about having all the answers from the start—it’s about building the right foundation and adapting as you go.