Euroclear reduces duplicate data spending with Collibra

Pierre Delville is the Chief Data Officer of Euroclear, an international financial market infrastructure company that settles securities transactions and, as of 2020, held nearly 33 trillion euros in client assets. The total value of securities transactions settled by Euroclear in 2020 was nearly 900 trillion euros, roughly 12 times the world’s gross domestic product for that year.

As Pierre and the Euroclear team looked for ways to eliminate data duplication and improve data quality, they also recognized an opportunity for significant cost savings. The Harvard Business Review estimated that time wasted dealing with flawed data costs the US $3 trillion per year. The article describes “the hidden data factory,” in which knowledge workers waste 50% of their time hunting for data.

To address this problem, Pierre recognized the need to answer some important and difficult questions. 

“Does your company know how much it spends on data? How much are data products costing – data services, both internal and external?” Pierre asked. “How are your data costs linked to consumption, storage, usage, handling and maintenance? Is it cheaper to hold your data in mainframe databases, SQL Server data warehouses or Hadoop data lakes?”

Rather than looking at their enormous volume of data as a challenge to be tackled, Euroclear recognized that the data they owned should be treated as a business asset, like clients’ financial assets, infrastructure, or staff. Harnessed correctly, data can drive operational efficiency gains that lead to cost reductions. Answering those questions would help quantify those benefits to the business.

Quantifying the cost of data

A key part of maximizing the value and utility of their data would be to minimize or eliminate data duplication. 

The team recognized that not all duplication is necessarily bad. “Duplication is often necessary for business continuity and business resilience. It can be necessary for regulatory compliance,” said Pierre. “But if its necessity cannot be confirmed, it should be eliminated. Unnecessary duplication results in expensive and avoidable reconciliation efforts down the line.”

First, Euroclear teams tried to quantify (and in some cases estimate) the cost of data: “Direct vs. indirect, variable vs. fixed, OpEx vs. CapEx. Covering the cost of infrastructure, data centers, storage, compute, network, the cost of platforms, databases, middleware, mainframe. The cost of data vendors, facilities, business processes.” It was a huge and complex task: for many of these costs, no solid reporting or hard figures were readily available, so shortcuts and estimates had to complement the information on hand.

Beneath all that, Euroclear teams identified inefficient business processes as the big cost multiplier. “Wasted time to find, understand, extract, reconcile, clean data. The cost of wasted time handling data is the largest single item in the cost of our data in our company. And that cost is driven notably by duplicate data and duplicate data usage.”

To begin the process of duplicate data elimination, they turned first to the two million rows of data in their securities database.

“We clarified master sources, starting with key data. We made that information transparently available to all employees,” said Pierre. “That made it possible for current and future data consumers to know in a faster and easier way what exists, what it means and where it is.”

This created the conditions for reducing duplication, which in turn cuts the time wasted on reconciliation efforts. It also made it easier for subject matter experts to understand the contents of the securities database. Streamlining data definitions in this way increased the value of the data as a business asset.
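The mechanics behind this kind of check are simple to illustrate. The sketch below is a hypothetical example, not Euroclear’s actual tooling or data model: it flags securities that appear in more than one source system, so each copy can be confirmed as necessary or retired.

import pandas as pd

# Hypothetical extract of securities reference data from two source systems;
# the column names (isin, source_system, issuer) are illustrative only.
records = pd.DataFrame([
    {"isin": "BE0003796134", "source_system": "settlement_db", "issuer": "Acme Corp"},
    {"isin": "BE0003796134", "source_system": "risk_warehouse", "issuer": "Acme Corp."},
    {"isin": "FR0000120271", "source_system": "settlement_db", "issuer": "TotalEnergies"},
])

# Count how many distinct systems hold each security; anything above one is a
# candidate for review against the declared master source.
copies = records.groupby("isin")["source_system"].nunique()
print(copies[copies > 1])

In practice, a check like this would more likely be driven by catalog and lineage metadata than by raw extracts, but the principle is the one Pierre describes: identify the copies, confirm whether each is necessary, and eliminate the rest.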

Pierre and his team continue to identify the key data relevant to Euroclear’s core processes. Using Collibra to catalog this data, they are expanding the scope to include support functions (such as HR, Finance and Compliance) and, importantly, sales and product development teams. They are also implementing the data sharing agreements that make the elimination of duplicate data possible.

This exercise is providing the insight into Euroclear’s data environment that will allow them to streamline their data, eliminate duplicates and reduce the duplication of reports and other data outputs.

Starting on the path of data modernization 

A great deal of hard work goes into such a large-scale data transformation. The effort requires strong policies and processes, training, coaching, monitoring and support. “For instance, standard rules to name and define business terms, or using one single central tool, like Collibra, to document and share data traceability.”

But data modernization also needs to focus on the people, not just the technology. “A good data transformation must take into account the change of values and culture, the new mindsets, the new behaviors, the new communities,” said Pierre. “Since we started rolling out Collibra in production a year ago, we have kept an eye on user adoption and employee awareness. And today, one year later, between 5 and 10% of the staff are already active users. This is only a start, of course.”

Pierre and his team at Euroclear regard the benefits from their data transformation as the new “business as usual” way of working and are embedding that mindset throughout their whole company. 

“It is something that will quietly revolutionize the way the company operates across divisions and entities.”

Want to hear more about Euroclear’s story?

Watch Pierre's Data Citizens presentation

