
Why wealth managers need proactive data quality management

by Nick Thacker | Jul 10, 2025

The hidden costs of bad data

Data quality issues are the “sand in the engine” of financial operations – and they’re costing wealth management firms far more than they realise.

In a recent presentation to Personal Investment Management and Financial Advice Association (PIMFA) members, DCI co-founder Simon Ray outlined how poor data quality creates a cascade of operational inefficiencies and why firms’ traditional approaches to data management are coming up short.

The real cost of data problems

Poor data quality has debilitating – often unrecognised – financial impacts.

Data observability provider Monte Carlo recently estimated that data teams spend 30%-40% of their time handling data quality issues instead of working on revenue-generating activities. That is a colossal waste.

It follows research by Dun & Bradstreet in 2015, which calculated that inputting data correctly from the outset to prevent data issues costs $1 per record. Identifying and resolving poor data immediately upon discovering an error costs $10. Resolving a problem that has been buried in a system for some time costs $100, as minor data errors mushroom into major problems in a “butterfly effect”. A decade on, the problems, and their associated costs, will only have grown.
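To see how those ratios compound, consider a purely illustrative example: on a hypothetical file of one million records with a 1% error rate, preventing those 10,000 errors at entry would cost about $10,000; fixing each one as soon as it was discovered, $100,000; and leaving them buried in downstream systems, $1,000,000.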

Such costs rarely appear in any P&L statement. Yet they represent a significant hidden expense for wealth management firms.

As do risk issues.

Failing to send income out to a client for the third month in a row breeds distrust, damages a firm’s reputation and incites clients to move their money elsewhere. Then there are the compensation slush funds, set up to pay clients when things go wrong. A straw poll we conducted of wealth managers found they were spending an average of 0.80% of turnover on client compensation – a truly terrifying figure.
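To put that figure in context: for a hypothetical firm with £100m in annual turnover, 0.80% amounts to £800,000 a year paid out simply to put mistakes right.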

Current approaches fall short

The biggest cause of bad data is human error. Manual processes remain common across the industry – whether during client onboarding, entering transactions or market trades, setting up new policies, reconciling with counterparties or generating reports. Any time you have manual input, mistakes are inevitable.

Inexperienced staff and poor training exacerbate the problem. New employees are often unaware of the critical process steps they should take or what appropriate values should look like, and lack the knowledge to spot errors. Procedures may be inadequate, or staff simply aren’t following them properly. High staff turnover in the industry aggravates matters further.

“Four eyes” or even “six eyes” checks are widespread. But they are expensive and unreliable, creating massive resource waste while still missing errors.

Some firms have opted for build-your-own solutions using in-house extracts, Microsoft Power BI, Excel spreadsheets and the like. Yet staff often don’t read the information that gets emailed around, creating a false sense of security while problems persist undetected.

The DCI approach: a rules-based golden source

Like a good football defence, the rules-based system DCI has developed stops problems before they cause damage, transforming data management from reactive scrambling to proactive problem prevention (a simplified code sketch follows the list below). The system works by:

  • Extracting relevant data from the user firm overnight.
  • Running comprehensive automated checks against predefined rules.
  • Providing, the next day, fully audited workflows for every data item that needs to be resolved.
  • Detailing the status of tasks through a clear, easy-to-use dashboard and producing reports for management.
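To make that flow concrete, here is a minimal sketch of what a nightly rules-based check could look like. DCI has not published its implementation, so every name here (Rule, WorkItem, run_checks) is hypothetical; the point is simply the shape of the pattern: records extracted overnight are run through predefined rules, and each failure becomes an audited work item with an explanation attached.

```python
# Hypothetical sketch of a nightly rules-based data quality run.
# None of these names come from DCI; they illustrate the pattern only.
from dataclasses import dataclass
from typing import Callable, Iterable, List

@dataclass
class WorkItem:
    record_id: str
    rule_name: str
    explanation: str  # what the rule checks and how to resolve the failure

@dataclass
class Rule:
    name: str
    explanation: str
    check: Callable[[dict], bool]  # returns True when the record passes

def run_checks(records: Iterable[dict], rules: List[Rule]) -> List[WorkItem]:
    """Run every rule against every record from the overnight extract,
    returning one work item per failure for the next day's workflow."""
    items: List[WorkItem] = []
    for record in records:
        for rule in rules:
            if not rule.check(record):
                items.append(WorkItem(record["id"], rule.name, rule.explanation))
    return items

# One basic validation: flag client records with no National Insurance number.
rules = [
    Rule(
        name="missing_ni_number",
        explanation="Client record has no NI number on file; obtain it and "
                    "update the master record.",
        check=lambda r: bool(r.get("ni_number")),
    ),
]

overnight_extract = [
    {"id": "C-1001", "ni_number": "AB123456C"},
    {"id": "C-1002", "ni_number": ""},  # will generate a work item
]

for item in run_checks(overnight_extract, rules):
    print(f"{item.record_id}: {item.rule_name} -> {item.explanation}")
```

In a real deployment each work item would flow into the audited workflow and dashboard described above; the sketch stops at generating the items.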

The checks range from basic validations (to see if there are missing National Insurance numbers, for example) to examining accrued interest rates and pay dates on complex fixed interest securities, or reconciling ESMA transaction reporting data against a firm’s own records to ensure what it is reporting is correct.
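As an illustration of the “basic validation” end of that range, a plausibility check on a National Insurance number might look like the sketch below. The real HMRC rules are more involved (certain prefix letters and combinations are disallowed), so treat this regex as a simplified assumption rather than a complete validator.

```python
# Simplified NI number plausibility check. The pattern captures the broad
# HMRC shape (two letters, six digits, suffix A-D) and excludes the
# disallowed prefix letters D, F, I, Q, U and V, but skips the finer rules.
import re

NI_PATTERN = re.compile(r"^[A-CEGHJ-PR-TW-Z]{2}\d{6}[A-D]$")

def ni_number_is_plausible(value: str) -> bool:
    return bool(NI_PATTERN.match(value.replace(" ", "").upper()))

print(ni_number_is_plausible("AB 12 34 56 C"))  # True: valid shape
print(ni_number_is_plausible("QQ123456C"))      # False: Q is a disallowed prefix
```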

The results give staff full visibility of the issues that need to be remediated, while giving auditors and regulators a complete history of what is going on. Each rule also carries a transparent explanation of what it checks and how to fix the work items it raises, which becomes a valuable form of training for employees.

The clean, clear and consistent data can then serve as a cross-enterprise, cross-system golden source used to power all downstream processes – a reliable source that will be even more critical as firms expand their use of AI.

In addition, removing the sand from the engine generates important operational gearing, since firms no longer have to waste time unearthing and resolving data issues. It creates more capacity within a firm’s existing architecture, enabling the business to grow without adding more operations staff.

The bottom line

Data integrity isn’t a project – it’s a business-as-usual process that requires daily attention. But it brings a healthy ROI. By investing in proactive data quality management, firms are not just enhancing their operational efficiencies, they are building strong foundations for future growth and innovation.
