Data Dilemmas

Published: March 17, 2025

Ed Birchall
Partner, UK & Ireland Business Growth, Nuix

Joel Fisher
Senior Manager, Liquidity, Working Capital and Business Resilience, EY-Parthenon

Luke Harris
Senior Manager, Treasury, Strategy & Transactions, EY-Parthenon

Teija Ridoutt
Head of Treasury Transformation Projects, Netceed

Why (and how) Treasurers Must Conquer their Digital Mountains

Businesses are repeatedly told that data is the new gold, or the new oil, such is its value to those who hold it. Yet sources of information can often be found scattered around the organisation as if they had little or no worth at all. In this exploration of data, we seek to give treasurers a solid starting point for bringing structure to relative chaos.

Data is a pervasive and unstoppable resource. And not just the digital kind; from paper documents to the internalised knowledge of individuals, data has enormous potential power. But while it can and should underpin all elements of strategic business decision-making, the challenge today, notes Luke Harris, Senior Manager, Treasury, Strategy & Transactions, EY-Parthenon, is trying to control exponential surges of what he refers to as “the four Vs of Big Data”.

These, Harris reminds us, are: volume (the amount of data generated and stored); velocity (the speed at which data is received); variety (the different types and formats); and veracity (the quality and accuracy of the data).

As the waves of data continue unabated, it’s no wonder many companies struggle with data management. Efforts to tackle the four Vs, notes Harris, are hindered by several factors, notably a failure to bring the right mix of technology, resources and capabilities to the task. Combined with an all-too-common lack of goal congruence and communication, he observes that many organisations are significantly underperforming on the data-maturity curve.

The relative maturity of functions such as treasury is ultimately measured by their ability to transform data into useful information. “But we often see treasury teams operating in silos in which data cannot be easily accessed and used,” comments Harris. “This affords them little or no opportunity to demonstrate their capacity to progress from their role as a cost centre into that of a strategic advisory partner. This is particularly true where liquidity and risk management are concerned, as these remain critical focus areas for treasury teams, especially through a private-equity lens, where value-add activities around cash are a key priority.”

When unable to establish additional value through data, many treasuries find themselves in a Catch-22. Without additional funding, there can be little or no improvement in data management. Yet without improvement in data management, it’s very hard to demonstrate the need for additional funding.

The common ‘good-enough’ condition

At a company level, it might be tempting to think that the problem is confined to smaller firms. Not so, says Teija Ridoutt, former Associate Treasurer, Howmet Aerospace, and currently Head of Treasury Transformation Projects, Netceed.

“For a small company with limited scope, data management is easier. It doesn’t mean that they’re good at it, though. At the same time, a large multinational, where better IT resources might be available, might not be any better at it either.”

A common theme that she detects is the lack of a harmonised approach following M&A, for example. “Companies aren’t immediately focusing on how to bring the data together; they simply onboard the new company and then somehow start to coexist, using legacy systems and resources to get by.”

While certain functions might try to consolidate at some level, Ridoutt notes that many do not have the luxury of time or resources to achieve what’s necessary with their expanding data pools. “You keep on taking data in a way that enables you to get through each day. Often you don’t have the time to go ahead with something better, so you learn to live and operate with various ‘good-enough’ solutions.”

Bolt-on issues

The perpetual challenge posed by operational silos, and lack of communication between teams, remains a common barrier for effective data management, notes Joel Fisher, Senior Manager, Liquidity, Working Capital and Business Resilience, EY-Parthenon. And where multiple entities are indeed “bolted together” post-merger, he says companies are left not only with a mish-mash of disparate and detached systems but also “conflicting cultures and levels of data and financial literacy”.

As a consequence, the search for working capital intelligence, for instance, can be slow and inaccurate. Furthermore, certain functions may not be quite as invested in the building of a company-wide cash culture as working capital and treasury specialists will be, notes Fisher. “Unfortunately, those functions may be the owners of key input data, and if they don’t or won’t understand how their actions affect cash flow, they are unlikely to be as accommodating with the timely and consistent delivery of data.”

In the course of his work, Ed Birchall, Partner, UK & Ireland Business Growth at investigative analytics and intelligence software firm Nuix, notes a specific format issue. Some companies may have a good grip on their structured operational data but, due in part to the surfeit of new communication tools and channels that many embraced during the pandemic, the growing volume of unstructured data has amplified the challenge for businesses trying to make sense of their own world.

Platforms for instant messaging and video calls, for example, “are all highly pertinent to how people work today”, says Birchall. “But companies are finding it difficult to accurately capture and understand their content.” For it to be useful, and handled within the realms of compliance, he warns of the need to quickly capture the value and relative sensitivity of this data.

This is where the treasury’s Catch-22 really bites. Some may have access to a treasury IT team that understands treasury needs and the complexities of IT – as did Ridoutt during her time at Nokia – but she knows this is a rare occurrence.

Attracting investment

Where lack of investment is the problem, Fisher urges treasurers not to plead the case for a departmental upgrade in isolation, but instead to devise a business case highlighting the potential value of that investment to the wider business. This, he suggests, could be kick-started by a benchmarking analysis of working capital performance.

Through a set of comparative historical and peer-related working capital metrics, treasury could demonstrate the potential levels of additional cash that could be brought into the business following system and process enhancement.
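The article does not name specific metrics, but a typical approach expresses working capital in days, for example days sales outstanding (DSO), and prices the gap to a peer benchmark in cash terms. A minimal Python sketch, with purely illustrative figures and an assumed peer benchmark:

```python
# Illustrative only: pricing a DSO gap against a peer benchmark in cash terms.
def dso(receivables: float, annual_revenue: float) -> float:
    """Days sales outstanding: average days taken to collect receivables."""
    return receivables / annual_revenue * 365

def cash_release(annual_revenue: float, current_dso: float, target_dso: float) -> float:
    """Cash freed by closing the gap between current and benchmark DSO."""
    return (current_dso - target_dso) * annual_revenue / 365

revenue, receivables = 500e6, 75e6           # hypothetical company
current = dso(receivables, revenue)          # 54.8 days
peer_median = 45.0                           # assumed peer benchmark
print(f"Current DSO: {current:.1f} days")
print(f"Potential cash release: {cash_release(revenue, current, peer_median) / 1e6:.1f}m")
```

The same framing applies to payables and inventory days, giving each improvement lever a single cash number a CFO can weigh against the cost of the system and process enhancement.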

This, notes Fisher, would be especially favourable during periods of growth, but also as a means of further mitigating risk and uncertainty around working capital and liquidity management.

“There are many other ways to demonstrate the value of enhanced data management,” he comments. “In each case, if the CFO is shown a broader financial value of treasury investment, it helps reinforce the case for that outlay.”


BOX 1: A Treasurer’s Perspective

The goal of most data projects will be to transform a complete, unique, relevant and consistent data set into timely, actionable information. Teija Ridoutt, Head of Treasury Transformation Projects, Netceed, recalls project experiences.

In one of my previous companies, we had only monthly visibility over global cash and free cash flow (FCF). I started to build a tool that gave us daily visibility of cash, bringing all bank statement data together in one place. While around 80% was acquired automatically, the remainder had to be manually loaded into a Power BI-linked database. It was a simple step, but one that already gave leadership daily cash visibility via the Power BI dashboard. We then pushed further, adding a calculation of the daily change in FCF. For this we collaborated with Shared Services and other finance functions to map and reconcile the elements we were able to see on the bank accounts.
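Ridoutt doesn’t detail the mechanics, but the consolidation step might look something like the following sketch, assuming per-bank CSV statement exports and a manually maintained workbook for the remaining accounts (file names and column headings are hypothetical):

```python
# Minimal sketch of consolidating daily bank balances into one table.
# File names, columns and the manual workbook are hypothetical.
from pathlib import Path
import pandas as pd

def load_statements(folder: str) -> pd.DataFrame:
    """Combine automated bank statement exports (one CSV per bank)."""
    frames = [pd.read_csv(f, parse_dates=["date"]) for f in Path(folder).glob("*.csv")]
    return pd.concat(frames, ignore_index=True)

auto = load_statements("statements/")                                 # ~80% arrives automatically
manual = pd.read_excel("manual_balances.xlsx", parse_dates=["date"])  # the rest, keyed in by hand

position = (pd.concat([auto, manual], ignore_index=True)
              .groupby(["date", "entity", "currency"], as_index=False)["balance"]
              .sum())

position.to_csv("daily_cash_position.csv", index=False)  # one tidy table for the Power BI dashboard
```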

For the CFO, it was a game-changer, because he now had visibility not only of the global cash balances but also of FCF. This enabled him to take action during the quarter, making adjustments or initiating capital contributions, because it was now evident when, and by how much, the business was going to be within its targets. Instant visibility opened up a whole new playground for treasury because it stimulated the CFO’s curiosity about what else we could achieve, which in turn generated project support.

Initially for treasury it was a manual Excel-based task, taking the team two hours a day to provide this visibility. Then we started talking to vendors about making the process easier so that we could begin deriving further value from our live bank data. We learnt how to identify the different elements on the bank account database, distinguishing between HR payments, different tax types, AP and AR transactions and so on.
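The box doesn’t specify how those elements were identified; one simple, commonly used approach is rule-based tagging of statement narratives, sketched below with entirely hypothetical keywords:

```python
# Hypothetical rule-based tagging of bank statement narratives.
# Categories and keywords are illustrative; naive substring matching
# like this would need refinement (and a manual-review queue) in practice.
RULES = {
    "HR": ["payroll", "salary", "pension"],
    "Tax": ["vat", "hmrc", "corporation tax"],
    "AP": ["supplier", "invoice"],
    "AR": ["customer receipt", "remittance"],
}

def categorise(narrative: str) -> str:
    text = narrative.lower()
    for category, keywords in RULES.items():
        if any(k in text for k in keywords):
            return category
    return "Unclassified"  # flag for manual review

print(categorise("BACS PAYROLL MARCH"))      # -> HR
print(categorise("HMRC VAT Q1 SETTLEMENT"))  # -> Tax
```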

Building that structure enables the database to be constructed from the source data – in this case, bank account statements – pulled in via APIs to a data repository. I like the idea of a massive ‘data lake’, so that anyone with access – tax, HR, finance, treasury and so on – can just pull the live data they need for their own analysis.
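No particular bank API is named, so purely as a generic sketch: fetch a statement over an authenticated HTTP API and land the raw payload, untransformed, in a shared folder hierarchy standing in for the data lake (the endpoint, token and layout are placeholders):

```python
# Generic sketch: pull a statement via an API and land the raw JSON in the
# "lake". Endpoint, auth and layout are placeholders, not a real bank API.
import datetime as dt
import json
import urllib.request
from pathlib import Path

API_URL = "https://api.example-bank.com/v1/accounts/{id}/statements"  # placeholder

def fetch_statement(account_id: str, token: str) -> dict:
    req = urllib.request.Request(
        API_URL.format(id=account_id),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def land_in_lake(payload: dict, account_id: str) -> None:
    """Store the payload untransformed so any function can reuse it."""
    folder = Path("lake/bank") / account_id
    folder.mkdir(parents=True, exist_ok=True)
    (folder / f"{dt.date.today().isoformat()}.json").write_text(json.dumps(payload))
```

Downstream users then read from the lake rather than calling the bank again, which is exactly the deduplication saving described in the next paragraph.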

I believe there’s much that treasury can offer using data mined and stored in this way. As the function responsible for managing bank fees, we can see the massive accumulation of costs as different departments pull the same data, often several times a day. Spread across multiple silos, there is no way for treasury to fully understand why that data is being called, and therefore no way to manage costs. With the data lake, treasury is able to communicate to the rest of the finance organisation that all the live bank data they need, or might benefit from, is available in a single location. The fee savings from deduplication alone are significant.

Perfect position

Given that treasury sits at the heart of the company and controls access to bank accounts, it also controls access to this data. Other functions may not realise how well placed treasury is to support them, so this is a great platform for treasury to start building knowledge among those other functions of the data it can provide.

Treasury is ideally placed to co-drive a financial data management initiative: I believe such an initiative should be jointly driven by stakeholders from finance, HR, tax and IT. It cannot be driven by IT alone, as IT lacks the functional perspective. I’ve been part of IT-led projects that had significant operational consequences for the functional areas. End users in treasury and other finance teams were brought into the project too late, and significant changes with a negative impact on our processes were already in motion. The take-aways of such projects were not value-adds but additional hurdles that decreased efficiency and increased frustration.

Treasurers themselves may argue that they have limited time and resources to take on the driving role. But from my experience, it really is worth taking on this initiative and pushing it ahead with others, because it has always paid back in terms of time savings going forward.

Data-management initiatives also open up access to a raft of new tools. I’ve looked in depth at AI and cash forecasting tools for example. I remain convinced that if treasury does not take the time to sort out its data first, the value-add of these new tools is not as immediate as one might wish. The plug-and-play approach rarely exists.

One of the fundamental questions around data management is where the data should be held. I talked about the data lake idea, but this is where the IT department steps in to host and support the data. For the end user it doesn’t really matter how or where it’s hosted, as long as it’s in a place where all the different functions can easily access and use it.

Only what’s needed

There will be a huge volume of data in every company, and many great objectives could be achieved with it. But no one has time to pursue a vast vanity project, and there’s no benefit in becoming mired in the concept of ‘everyone accessing everything’. I believe that it’s essential for each user to begin by asking which data, on a day-to-day basis, will actually add most value to their working life. Focus on only what is needed for the collective initiative to gain momentum.

A treasury project may start with bank statements and an assessment of access to global banking information. Where previously I could download perhaps 80% of bank-account data automatically, I have also been in situations where the company had more than 200 bank accounts and only 15% were accessible by the treasury team. I relied on 10 different people to provide bank account balance information every day. As there was no TMS in place, I set up a SharePoint database to collect and collate the data. This could be a simple initial step for treasurers of smaller companies. The precise starting point of their project depends on their baseline assessment, but the crucial element is that the project starts. Go for the quick wins that will forge your path towards the long-term structure you want to have in place.


Quick wins create momentum

The most effective starting point for a data-management project is to gather the requirements and list concrete use cases, says Veikko Koski, Founder and CEO, FinanceKey. It means mapping out the data, where it’s stored, how it is being used and by whom. It also demands an honest assessment of how well the data is meeting its intended purpose.

Companies must also work out where data gaps exist, relative to their goals, and how and where to plug these gaps. This requires consideration of both internal and external, structured and unstructured data.

“This exercise will help build an understanding of the overall pain the organisation is feeling and, from there, find the potential low-hanging fruit where the company can achieve the greatest value versus required effort,” he explains. “While planning, the sooner one can take action and showcase results, even with simple proofs of concept, the easier it will be to engage the teams behind the initiative.”

In terms of goal setting, Koski believes that teams should aim high to achieve a “golden source” of real-time finance data, but should plan “iterative steps on how to get there, with each milestone bringing concrete value to the organisation”.

The need to take small, incremental project steps is echoed by Fisher. “When we talk about working-capital optimisation, we often start with the low-hanging fruit versus long-term aspirational change,” he explains. “Focusing on those quick wins creates project momentum; the business case for aspiration follows later.”

Educating all

Having implemented a number of cash- and capital-reporting mechanisms, Fisher understands that data efficiency is not, as might be expected, solely a technology play, and that education is crucial. A key part of the process will likely see treasury fostering deeper relationships across the business, with the most effective models involving collaboration at least between IT, Finance, and Treasury.

“IT will bring the practical application and robust risk management around the solutions being developed, while the finance and treasury functions will bring their own practical positions as to how the data and resulting outputs should be structured. Ideally, there will also be additional early support from key stakeholders who are going to use these outputs, and ultimately CFO or other senior executive sponsorship,” he explains. The latter is valuable because if certain data sources are not forthcoming, it may be necessary to encourage the revision of processes and policy to ensure they are.

Naturally, every project needs a co-ordinator. Harris suggests that the driving seat could be best occupied by the treasurer, especially those in companies with an established treasury committee, where the attention of the CFO and other functions is assured.

However, depending on the goal, a much broader approach to education could be required. This may bring departments such as procurement, AR/AP, accounting, tax, and HR into the mix. The more colleagues from different functions that are involved in the data journey, the more beneficial the solution can be to the wider organisation.

As Fisher recollects: “For one client, where cash reporting data was assisting the commercial team to manage returns on working capital invested in individual products, their involvement in the development of that working capital tool had helped them grasp the importance of cash data to the whole business”.

“Treasurers work at the crossroads of multiple systems, often feeling the most pain of all within the finance organisation,” notes Koski. He believes treasury’s inputs are crucial, bringing an understanding to the wider business of how important access to clean, non-aggregated data is, for example, for accurate FX hedging. “Data is the new gold, and the more curiosity treasurers have towards data governance and technologies, the better they will be equipped to support company-wide data initiatives.”


BOX 2: PaaS power: Platforms, Drivers and Engines

With the multitude of different connections, data types, and sources all coming together at once, Ed Birchall, Partner, UK & Ireland Business Growth, Nuix, argues the case for the platform-as-a-service (PaaS) structure. PaaS, he explains, is the “engine room” behind the front ends of multiple platforms. It’s a cloud-based offering that provides a development environment for building, running, and managing applications.

Individual users will be familiar with the front ends of their own function-specific platforms. But these tend only to deliver on-screen what is required within each tightly defined setting. “PaaS can enhance and enrich those front ends by bringing together multiple data sets, with multiple language models, to analyse large amounts of data,” explains Birchall. “It presents a business with a new way to look at its technology stack, and can feed multiple new use cases at the front end.”

Nuix, which operates in areas such as law enforcement, litigation, and regulation, uses the notion of data ontology to define data concepts, relationships, and properties. This enables users to more easily share and build data relationships across different systems, sources and functions.

The use of ‘contextual AI’ is a key part of this process. Here, raw data is processed within the framework of relevant properties such as source, time period, location, and the role of the data user and their professional queries. This enables the PaaS algorithm to present, via the user’s own front end, only what is needed for their task. As a simple analogy, when using Google, the more related information is added to the search bar, the more focused the response will be.
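Nuix’s actual pipeline isn’t described here; purely as a toy illustration of the idea, the sketch below scopes the same record store by the user’s role, region and time window before matching a query (all names and data are invented):

```python
# Toy illustration of context-scoped retrieval: one record store, filtered
# by the user's role, region and time window before a query term is matched.
# A simplification for illustration, not Nuix's contextual-AI pipeline.
from dataclasses import dataclass
import datetime as dt

@dataclass
class Record:
    source: str
    region: str
    timestamp: dt.date
    roles: set[str]          # roles permitted to see this record
    text: str

def contextual_query(records, *, role, region, since, term):
    return [r for r in records
            if role in r.roles
            and r.region == region
            and r.timestamp >= since
            and term.lower() in r.text.lower()]

records = [
    Record("chat", "UK", dt.date(2024, 11, 2), {"treasury"}, "FX hedge ratio query"),
    Record("email", "US", dt.date(2024, 3, 5), {"legal"}, "litigation hold notice"),
]
hits = contextual_query(records, role="treasury", region="UK",
                        since=dt.date(2024, 1, 1), term="hedge")
print(hits[0].text)  # -> FX hedge ratio query
```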
