Some say that data is the new “black gold,” but I believe that just like crude oil, data has little value until you extract it, refine it, and put it to use.
In this post, I will share some of the insights offered by a group of leading data and analytics experts in financial services at an executive dinner held in late February, as we explored why so few transformation projects fully meet their stated objectives.
Financial Services and the Transformation Agenda
If you are in the business of financial services, you are also in the business of data; financial institutions depend on timely, accurate data to make the decisions that support every aspect of their operations.
To succeed in this modern world that can be characterized as volatile, uncertain, complex, and ambiguous (VUCA), organizations need to ramp up their transformation aspirations – they need to become more agile, resilient, secure, and performant, while simplifying processes and technology, and all without driving up costs or increasing time-to-value.
At dinner, we broadly agreed that transformation projects rarely deliver the envisaged business benefits (possibly fewer than 10 percent do), and we concluded that this could be the result of several challenges, including:
- The lack of a clear vision on the reason for change, or of the desired outcomes
- Changing course mid-implementation to accommodate new objectives
- Knowledge about how the business works not being widely understood across business silos
- Different stakeholders not understanding the challenges faced by other stakeholders within a complex (and often international) environment
- Concerns about the perceived risks of making the change, especially in disrupting live operational systems, including “legacy” systems
- The cost of change; for example, predicting the cost of cloud operations, but without the ability to articulate the value of the new state
- Making the data needed for new systems available in a timely, secure, and effective way
That said, the promise of successful transformation remains: faster, cheaper, more reliable, resilient, and effective processes that deliver more personalized customer and employee experiences while also creating new opportunities for innovation and growth.
Data Is at the Heart of Transformation
There has never been a greater need for financial services institutions to stand out from competitors, as the sector becomes more “commoditized,” meaning that goods and services look broadly like those of competitors and lack differentiation. One way to be different is to utilize the value of the organization’s own data; data that is unique to the organization.
It is easy to focus on the internal benefits of transformation: processes that are faster and more efficient than their predecessors. But that is rarely what customers notice, and it is increasingly what they have come to expect anyway. What does get the attention of today’s customers (and employees) are new capabilities that are relevant and personalized to their needs; to deliver that, organizations will need to leverage data in new ways.
One of the biggest challenges faced by organizations on a transformation journey is where to locate the data that is needed and how to make it available to the people and processes that need it.
Indeed, the ability to create more focused, complex, personalized, and tailored financial products for customers is often hampered by a fundamental lack of insight; too often the data is locked up in spreadsheets or held in functional silos, and therefore not available either for new operational systems or for the analytics at the core of personalization. Data integration can be labor-intensive and prone to error. As one senior risk director at the dinner said:
“Too much of what we do is still manual – data has to be manually re-entered from one system to the next.”
Furthermore, traditional methods of integrating data, such as extract, transform, and load (ETL) processes, are time-consuming and expensive to implement and maintain, especially when integrating data from multiple sources, such as cloud, on-premises, third-parties, and so on. Data architectures have become very fragmented and distributed, and the prospect of creating yet another data silo is daunting.
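To make the maintenance burden concrete, here is a minimal sketch of a hand-built ETL step in Python. The source names (`crm_export`, the field mapping) are hypothetical, but the shape is typical: every source needs its own bespoke extract and transform code, and each change in a source format means rewriting that code.

```python
# A minimal sketch of a traditional ETL step. The source and field names
# are hypothetical; the point is that each source needs bespoke code.
import csv
import io

def extract_crm(csv_text):
    # Extract: parse customer records from a CSV export of one source.
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(records):
    # Transform: normalize field names and types. Each new source
    # typically needs its own hand-written mapping like this one.
    return [
        {"customer_id": r["id"], "balance": float(r["bal"])}
        for r in records
    ]

def load(rows, warehouse):
    # Load: copy the rows into the warehouse, creating another data copy
    # that must now be kept in sync with the source.
    warehouse.extend(rows)

warehouse = []
crm_export = "id,bal\nC001,120.50\nC002,87.00\n"
load(transform(extract_crm(crm_export)), warehouse)
```

Multiply this by dozens of cloud, on-premises, and third-party sources, and the cost of building and maintaining such pipelines becomes clear.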
Yet another challenge that traditional ETL approaches struggle with is the need for agility (not to be confused with the Agile methodology). Volatility in today’s world can be understood as rapid changes that were not predicted. So, data requirements will also change to follow changing needs. The need for agility also implies that the need for change is perpetual, addressing new opportunities and challenges as they arise.
Providing Data that is Accurate, Timely, and Trusted
A common complaint is “I have an ocean of data, but barely a cupful of insight that I can actually use.” This can be caused by a number of factors, such as how the data was collected, how long ago, the accuracy of any calculations made on the basis of it, and the perspective of the user.
At the dinner we discussed how to reduce the amount of time wasted debating the veracity of a data item, rather than understanding what it means and what should be done about it. For instance, it is not uncommon for all the attendees at a meeting to have different values for the same thing (e.g., the number of current customers). What is needed is to show the lineage of the data; with this transparency comes trust. It also means that it is easier to explain to regulators and stakeholders how data is sourced and used. As one chief data officer put it:
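One lightweight way to picture lineage is to carry provenance metadata alongside any derived figure. The sketch below is generic (not any specific lineage product), and the source names and derivation rule are assumptions for illustration; the idea is that a figure such as "number of current customers" arrives with a record of where its inputs came from and how it was computed, so a meeting can discuss the number rather than its veracity.

```python
# A generic sketch of attaching lineage metadata to a derived figure.
# Source names ("crm.customers", "billing.invoices") are hypothetical.
from dataclasses import dataclass, field

@dataclass
class TracedValue:
    value: float
    sources: list = field(default_factory=list)  # where the inputs came from
    derivation: str = ""                         # how the value was computed

def count_current_customers(crm_rows, billing_rows):
    # Derive one agreed figure, recording its inputs and its rule.
    crm_ids = {r["id"] for r in crm_rows}
    billed_ids = {r["customer_id"] for r in billing_rows}
    active = crm_ids & billed_ids
    return TracedValue(
        value=len(active),
        sources=["crm.customers", "billing.invoices"],
        derivation="customers present in CRM with at least one invoice",
    )

result = count_current_customers(
    [{"id": "C001"}, {"id": "C002"}, {"id": "C003"}],
    [{"customer_id": "C001"}, {"customer_id": "C003"}],
)
```

With this transparency, everyone at the table can see why the figure is 2 rather than 3, and regulators can be shown the same trail.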
“We need to be able to explain what and how data is used, even as we transform the way we use it.”
We also noted that this can remove the ability to “game the numbers”; simply put, the number becomes the number, budgets can be based on actual costs or targets, objectives can be clearly stated, and progress can be measured. The result is an organization that becomes used to working with the unalloyed truth.
Using Data to Empower Staff
Many of those around the table were exploring the use of tools like machine learning (ML) and artificial intelligence (AI) to support self-service and carry out many of the repetitive tasks that used to be done by staff. Those same employees then have more time to focus on strategy, innovation, and other valuable work that only humans can do.
However, getting access to appropriate data that is secure and well-governed, while still meeting regulatory and compliance demands, has been a challenge.
The Denodo Platform enables organizations to address the challenges outlined in this post. By working at a logical level, abstracted above monolithic architectures, it’s possible to create the agile, secure, and governed infrastructure that many financial institutions desire, and even position the enterprise for a data mesh, if that is anticipated.
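To illustrate the underlying idea of a logical layer (this is a generic sketch of data virtualization, not the Denodo API), a virtual view answers queries by federating live sources on demand rather than copying data into yet another silo. The source names below are hypothetical:

```python
# A generic sketch of the data-virtualization idea: query-time federation
# over live sources, with no replicated copy. Not any vendor's actual API.
class VirtualView:
    def __init__(self, sources):
        # Each source is a callable returning rows; nothing is replicated.
        self.sources = sources

    def query(self, predicate):
        # Pull matching rows from every underlying source at query time,
        # so consumers see current data without a new silo.
        return [row for fetch in self.sources.values()
                for row in fetch() if predicate(row)]

# Hypothetical sources: one "cloud" feed and one "on-prem" feed.
cloud = lambda: [{"id": "C001", "region": "EU"}]
on_prem = lambda: [{"id": "C002", "region": "US"}]

customers = VirtualView({"cloud": cloud, "on_prem": on_prem})
eu = customers.query(lambda r: r["region"] == "EU")
```

Because consumers query the view rather than the sources, the underlying systems can be moved or replaced without disrupting them, which is the agility the dinner discussion called for.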