Athena Sharma, Consulting Director and Global Financial Services Lead at Artefact, and Chris Bannocks, Group Chief Data Officer at QBE Insurance, discuss data maturity – what it is, the role it plays in business today, and why it’s a particular challenge for banks, insurance companies, brokerage houses, credit unions and other businesses in the financial services sector.

What do we mean when we talk about data maturity?

In broad terms, data maturity measures an organisation’s ability to create value from its data. To achieve a high level of maturity, data must be deeply embedded throughout the organisation and fully integrated into every decision and activity. Data maturity is a key factor in successful digital transformation: the higher a company’s maturity, the greater its competitive advantage.

To measure data maturity, a maturity model is used to assess a company’s various data and digital capabilities. “There are many different maturity models available, but they all use a set of questions to rate a company’s level of maturity across many different dimensions,” explains Chris Bannocks. “These include a company’s capabilities – or maturity – in its architecture, people, analytics, ethics, privacy and so on. It’s an objective measure that can be delivered either by independent review or self-assessment.

“The result is an aggregation, so companies should decide where they want to acquire more data maturity and where less is acceptable. You don’t have to be at the highest level in every dimension. You have to ask yourself, ‘Maturity for what purpose?’”
says Chris Bannocks.
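
To make the idea of aggregation concrete, the sketch below shows one way dimension-level scores could be rolled up into an overall maturity figure. It is a minimal illustration only: the dimension names, the 1–5 scores and the weights are assumptions, not taken from any particular maturity model.

```python
# Minimal sketch of a maturity-model aggregation (illustrative only).
# Dimension names, scores and weights are assumptions, not a standard.
from statistics import mean

# Hypothetical self-assessment scores on a 1-5 scale, per dimension.
scores = {
    "architecture": 4,
    "people": 3,
    "analytics": 3,
    "ethics": 4,
    "privacy": 5,
}

# Optional weights reflecting "maturity for what purpose?" - a firm may
# deliberately aim higher in some dimensions than in others.
weights = {
    "architecture": 0.3,
    "people": 0.2,
    "analytics": 0.2,
    "ethics": 0.1,
    "privacy": 0.2,
}

simple_average = mean(scores.values())
weighted_average = sum(scores[d] * weights[d] for d in scores)

print(f"Unweighted maturity score: {simple_average:.1f} / 5")
print(f"Weighted maturity score:   {weighted_average:.1f} / 5")
```

In practice, reporting the per-dimension scores alongside any aggregate is what lets a company decide where more maturity is worth pursuing and where a lower level is acceptable.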

Why is data maturity especially important for financial institutions?

Because regulation is a crucial issue in the world of finance, the higher a bank or insurance company’s data maturity, the more likely its data is to be well-controlled, well-managed, well-governed and secure.

“Data such as risk metrics, client statements or financial reporting may be deemed more accurate as a result of data maturity in those areas,” says Chris Bannocks.

But regulation isn’t the only driver of data maturity. A data-mature financial institution (FI) can leverage the full spectrum of solutions that big data analytics and AI have to offer, leading to better decision making, a more connected business and greater competitive advantage. For example, a more data-mature insurance company would be better at using AI to determine personalised insurance risk for each customer, leading to better-tailored policies and significant business benefits.

“Leveraging AI and big data analytics for data-driven decision making, enabling greater personalisation and optimising processes is a critical success factor for banks and insurance companies. But overcoming legacy data maturity challenges continues to be a primary blocker,”
says Athena Sharma.

Where do financial institutions currently stand on the data maturity scale?

Many FIs still approach data maturity as an all-or-nothing proposition – either you are data mature or you are not. Instead, data maturity should be viewed as a spectrum, with varying degrees of maturity appropriate for different parts of the business, says Athena. “This segmented view of maturity makes the data challenge more digestible for FIs, enabling them to target priority areas first and to trade off maturity in areas that may not be as important for the business, or that may generate lower ROI.”

According to Chris, on a data maturity scale of 1 to 5, most financial institutions sit somewhere between 3 and 4. “FIs have made great strides because they’ve been hard at work in the maturity and capability space for about 15 years. But if you dive down and look at data quality or architecture or ethics, you’ll see varying levels of maturity according to industry.”

What challenges do financial institutions face in the data maturity journey?

The complexity and age of legacy architecture and systems remain among the primary challenges facing banks and insurance companies. Of the world’s 100 leading banks, 92 still rely on IBM mainframes for their operations. Relying on outdated systems means a lack of agility and an architecture that cannot cope with growing workloads, especially when it comes to big data.

“As a result of these challenges, FIs need to either grow their processing capacity or rebuild their existing architecture, both of which are high-effort, time-intensive solutions. This is in sharp contrast to nimble FinTechs, which can remain agile and customer-centric from the get-go,”
says Athena Sharma.

The second key challenge relates to people and organisational complexity. Behavioural change and adoption are hard to achieve across industries, not just financial institutions. But the organisational complexity of FIs makes it harder to drive data accountability and ownership and deliver business value.

According to Chris, “It may be easier for people in banking and insurance groups to change behaviour as they’re used to regulated environments and can adopt things like data governance more easily than those in other industries might. But the size and complexity of the organisational structures themselves may present different demands from a data maturity perspective: a retail bank has a different maturity need than an investment bank. The difference may be minor, but it requires that the response to each project be tailored to each case in terms of business value, not just regulatory value.”

What steps can financial institutions take to become more data mature?

The broad answer is to accelerate towards analytics. Here are some steps FIs can take to achieve this:

  • Assess the foundations: FIs should start with a comprehensive audit of their data foundations, not just by assessing technical capabilities but also by asking the right business questions (a minimal sketch of such technical checks follows after this list). “In most cases, the existing foundations are good enough. Once financial institutions begin to put data maturity measures around them, they’ll incrementally improve,” says Chris.

  • Pilot the right use cases: FIs need to adopt an agile approach where they “change by acting now, rather than acting after change has occurred,” says Athena. In practice, this means selecting use cases that address specific business issues, outlining requirements to enable these use cases and changing data maturity parameters in line with these requirements. This makes change more manageable and effective.

  • Realise value early to maintain momentum: A key setback is that large-scale transformation projects lose momentum over time, and value realisation takes too long or sometimes doesn’t take place at all. A use case-driven, agile approach to data maturity enables early value realisation and, therefore, greater business buy-in.

  • Keep people at the heart of change: According to Chris, “Data maturity has to include and understand the human component, as well as the consumption and use of data within the model, not just the way you manage it to its endpoint. To get real value from a data maturity model, all these components must be included.”
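
As a starting point for the “assess the foundations” step above, the sketch below shows what the technical side of such an audit might look like: a few basic data-quality indicators (row counts, duplicate keys, null rates) computed for a hypothetical customer table with pandas. The table, column names and checks are illustrative assumptions; a real audit would also cover the business questions mentioned above.

```python
# Minimal sketch of the technical side of a data-foundations audit.
# The table, column names and checks are illustrative assumptions.
import pandas as pd

def audit_table(df: pd.DataFrame, key: str) -> dict:
    """Return a few basic data-quality indicators for one table."""
    return {
        "rows": len(df),
        "duplicate_keys": int(df[key].duplicated().sum()),
        "null_share_per_column": df.isna().mean().round(3).to_dict(),
    }

if __name__ == "__main__":
    # Hypothetical customer extract with a duplicate key and missing values.
    customers = pd.DataFrame({
        "customer_id": [1, 2, 2, 4],
        "postcode": ["2000", None, "3000", "4000"],
        "policy_start": ["2021-01-01", "2022-06-15", "2022-06-15", None],
    })
    print(audit_table(customers, key="customer_id"))
```

Running checks like these per table, and tracking the results over time, is one lightweight way to put the incremental maturity measures Chris describes around existing foundations.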
