Market data is one of the few costs that can reach seven figures, sit quietly in the background, and still be treated as “just the cost of doing business”. That’s not because leaders don’t care about efficiency; it’s because market data behaves like infrastructure. It underpins trading, research, risk, pricing, performance reporting, client servicing, and investment decision-making. When something touches that many workflows, the default assumption becomes: teams need access, so we provide it.
Over time, that assumption hardens into routine. A terminal here, a feed there, a specialist dataset for a desk, another provider because someone needs coverage in a region or asset class by Friday. None of these decisions are irrational in isolation. The problem is what happens at the system level: entitlements sprawl, contracts proliferate, and spend fragments across budgets in a way that makes the whole estate difficult to see, let alone manage.
When the question finally surfaces — “How much are we actually spending on market data?” — organisations often discover that the honest answer isn’t a number. It’s a collection of partial truths. Finance can assemble invoices and cost centres, but can’t tell you what value the spend produces or whether licences are being used. Procurement might hold many contracts, but not the ones bought under team budgets or in regional offices. Business teams can describe what they rely on today, but rarely have visibility of what the rest of the organisation already pays for. Data teams can talk platforms and pipelines, but are not always connected to commercial entitlements and licensing rules.
Everyone has a piece. Very few organisations have the system.
Why this is becoming urgent now
For years, organisations tolerated market data sprawl because it felt safer than slowing teams down. That tolerance is disappearing – not because market data has changed, but because the context around it has.
A few triggers keep showing up:
- M&A and integration. Two firms become one and suddenly you inherit duplicate contracts, overlapping entitlements, and teams with their own buying habits.
- Platform modernisation. Enterprise data platforms and cloud analytics don’t just change how you store data; they change the economics of access. Redistribution becomes possible in a controlled way, which means “buy another seat” stops being the only answer.
- Audit pressure and non-display risk. Vendors are sharper. Compliance expectations are sharper. “We think we’re fine” isn’t a strong position if you’re asked to prove it.
- Budget pressure. When leaders are funding AI, resilience, and transformation at the same time, market data stops being a background cost. It becomes a line item worth interrogating.
The result is that market data, once treated as BAU, is now colliding with strategic priorities. And that’s when the cracks start to show.
The problem leadership rarely sees (until it’s forced into view)
One reason this stays hidden is that no single function owns the full picture. Each group sees a valid slice of reality:
- Finance sees spend, not usage.
- Procurement sees contracts, not processes.
- Data teams see platforms, not entitlements.
- Business teams see needs, not duplication.
This isn’t negligence. It’s structural blindness and a by-product of how large organisations work. And because market data is essential, it often escapes the scrutiny applied to more visible transformation programmes.
There’s a second reason, too, and it’s harder to say out loud: market data is often politically charged. If a dataset is treated as “our edge” or “our budget”, centralisation can feel less like sensible governance and more like an attack. When that dynamic exists, internal reviews rarely fail because the analysis is hard; they fail because the organisation can’t agree who gets to define “the truth”.
What typically goes wrong inside enterprises
If you’ve worked in a large organisation, the pattern is familiar.
Market data is bought to solve immediate needs, and then it becomes embedded. People build spreadsheets, workflows, client reporting, and risk checks around it. Once that happens, even small changes feel risky – not because the organisation can’t cope, but because no one wants to be the person who breaks something important.
So the estate becomes sticky. And in sticky estates, a few things tend to happen:
1) “Local optimisation” beats enterprise value.
A desk gets what it needs quickly, under its own budget, and that feels like success. But the organisation loses bargaining power and visibility.
2) Contracts don’t live where people think they live.
Procurement has some. Legal has others. Finance has invoices. Teams have PDFs in inboxes. You can spend weeks simply finding the truth.
3) Usage data is hard to access or isn’t trusted.
Some providers make it easy. Others don’t. Sometimes the data exists, but no one believes it because it’s never been reconciled with entitlements and real workflows.
4) Ownership becomes personal.
“We pay for it, therefore it’s ours.”
That mindset is common, and it’s one of the reasons duplication persists even in well-run firms.
5) The renewal clock quietly controls what’s possible.
The biggest savings are often trapped behind renewal dates, notice periods, and commercial constraints. Miss the window and you’re paying for another year, even if the waste is obvious.
None of this is a character flaw. It’s what happens when essential services grow organically without an operating model to match.
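To make the “renewal clock” point concrete, here is a minimal sketch of how notice deadlines can be derived from basic contract metadata. All vendor names, dates, and notice periods below are invented for illustration; real inputs would come from the contracts themselves.

```python
# Hypothetical sketch: compute the last date on which notice can be served
# before a contract auto-renews. All data here is made up.
from datetime import date, timedelta

contracts = [
    # (vendor, renewal date, notice period in days) -- invented values
    ("Vendor A", date(2025, 3, 31), 90),
    ("Vendor B", date(2025, 1, 15), 120),
    ("Vendor C", date(2025, 9, 1), 180),
]

def notice_deadline(renewal: date, notice_days: int) -> date:
    """Last date on which notice can still be served before auto-renewal."""
    return renewal - timedelta(days=notice_days)

today = date(2024, 11, 1)  # fixed 'today' so the example is reproducible
for vendor, renewal, notice_days in sorted(
    contracts, key=lambda c: notice_deadline(c[1], c[2])
):
    deadline = notice_deadline(renewal, notice_days)
    status = "window open" if today <= deadline else "missed - locked in for another term"
    print(f"{vendor}: renews {renewal}, serve notice by {deadline} ({status})")
```

Even a sketch this simple makes the point: with the invented data above, one notice window has already closed before anyone asked the question, which is exactly how “obvious” waste gets paid for another year.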
The missed trick: market data is a data management problem
When market data spend gets attention, the default response is a contract review. That helps. It just doesn’t get you far enough.
Market data optimisation sits at the intersection of three things:
- Commercial terms — what you’re contractually allowed to do
- Entitlements — who can access what
- Distribution architecture — how data flows and gets reused across the organisation
If you only look at one of these, you will miss the bigger picture.
A contract-only review tends to assume “licence = user” and treat access as static. A usage-only review can spot dormant seats but miss what the contract allows (or prohibits). And if you don’t look at architecture, you’ll assume the only way to scale access is to buy more licences.
That’s how organisations end up paying three times:
- once in licences,
- again in duplication,
- and again in internal workarounds.
The savings don’t come from squeezing vendors alone. They come from how data is governed, accessed, and redistributed.
What you see when you connect the dots
When you take an evidence-led view — bringing spend, contracts, usage, and process knowledge together — patterns emerge quickly, and they’re rarely subtle:
- licences paid for but not used,
- the same datasets bought by multiple teams,
- overlapping vendor products doing the same job,
- contractual rights (especially redistribution rights) that exist but are not being used,
- and plain ambiguity: different numbers in different places, none of them trustworthy.
This is also where risk comes into view. Many organisations discover they’ve been operating on assumptions: about what constitutes “display” vs “non-display”, about whether data can be redistributed internally, about whether shared logins are permitted, about how entitlements map to modern platform usage. Those assumptions might be right. They might not. Either way, they should be explicit and provable.
To make “evidence-led” tangible, here’s what a focused review should be able to produce, quickly.
In ~4 weeks, you should be able to get:
- A validated spend baseline (what you’re truly spending, by vendor and category)
- Entitlement vs usage gaps (what you pay for vs what’s actually used)
- A duplication/overlap map (where teams buy the same or similar data)
- A renewals calendar with notice deadlines (so “we missed the window” stops being the excuse)
That set of outputs does two things. It gives leaders a credible, defensible picture of the opportunity. And it creates a roadmap that turns “we think there’s waste” into “we know where it is, and we know what to do next”.
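Two of those outputs — the entitlement-vs-usage gap and the duplication map — are straightforward to produce once the inputs are assembled. The sketch below is illustrative only: the teams, dataset names, costs, and usage counts are invented, and in practice the inputs would be reconciled from invoices, entitlement reports, and vendor usage logs.

```python
# Hypothetical sketch of the entitlement-vs-usage and duplication checks.
# All teams, datasets, costs, and usage figures are invented.
from collections import defaultdict

# (team, dataset, annual cost, usage events in last 90 days)
entitlements = [
    ("Equities desk", "RealTimePrices", 120_000, 4_812),
    ("Research",      "RealTimePrices", 120_000, 3_050),
    ("Risk",          "RealTimePrices", 120_000, 0),   # paid for, unused
    ("Research",      "FundamentalsDB",  45_000, 1_203),
    ("Ops",           "LegacyFeedX",     30_000, 0),   # paid for, unused
]

# Gap 1: entitlements with no recorded usage (candidate savings).
unused = [(team, ds, cost) for team, ds, cost, events in entitlements if events == 0]
unused_cost = sum(cost for _, _, cost in unused)

# Gap 2: duplication map -- datasets bought by more than one team.
buyers = defaultdict(list)
for team, dataset, _cost, _events in entitlements:
    buyers[dataset].append(team)
duplicated = {ds: teams for ds, teams in buyers.items() if len(teams) > 1}

print(f"Unused entitlements: {unused} (~{unused_cost:,}/yr)")
print(f"Duplicated datasets: {duplicated}")
```

The hard part in real estates is rarely the logic; it’s assembling inputs that finance, procurement, and the business all trust, which is why the reconciliation step matters more than the code.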
Why this often delivers value faster than other data initiatives
Most senior leaders have been burned by data programmes that promised transformation and delivered PowerPoints. So scepticism is healthy.
Market data optimisation behaves differently for one simple reason: it’s largely diagnostic before it’s transformational. You’re not trying to redesign the whole organisation on day one. You’re trying to get to truth, quickly, and then act where the evidence is strongest.
It also produces financial signal early. When you identify unused entitlements or duplicate buying, you are not debating future value — you’re identifying current leakage. That’s why this work often funds itself and builds momentum for the wider data strategy rather than competing with it.
For a CDO, that matters. One of the hardest things about leading a data agenda is proving value in a way the business trusts. Market data is one of the rare areas where the value is measurable, the outcome is understandable, and the money can be real.
Centralisation without the compliance headache
At this point, the word “centralisation” tends to trigger resistance — usually from people who hear it as “control”, “bottlenecks”, or “compliance theatre”.
That’s not what good looks like.
The goal is not to put everything in one place or give everyone access to everything. The goal is one governed supply chain, with controlled distribution and clear ownership, so access is deliberate, auditable, and designed around how teams work.
This is where architecture and commercial terms matter together. Some redistribution is permitted, some isn’t. Some access patterns are sensible, others create unnecessary cost or risk. A governed approach doesn’t flatten nuance; it makes nuance manageable.
Done well, this reduces friction rather than increasing it. It replaces informal workarounds with intentional design. And it makes it easier to reuse data safely, which is where both savings and capability come from.
From sunk cost to strategic enabler
The strongest version of this story isn’t “we cut spend”. It’s “we redirect it”.
Once market data is treated as a managed asset rather than an unavoidable expense, you create a flywheel:
- You reduce waste and duplication.
- You reinvest savings into the platform and governance that make reuse easier.
- You reduce the need to buy more licences just to scale access.
- You free up budget for analytics and AI work that actually moves the needle.
This is how market data shifts from a background cost to a strategic input that becomes safer to distribute, easier to reuse, and more valuable over time.
Why a different lens matters
Many reviews stop at paperwork: contracts, seat counts, invoices. That’s a start, but it rarely delivers sustained change because it treats market data like a procurement line item.
A data management lens treats it like what it is: a data estate. It asks how data flows, where it’s reused, what the contract allows, what the platform enables, and which access patterns are actually necessary for the business.
The value isn’t the report. The value is clarity leaders can act on, and a roadmap that connects commercial terms, entitlements, and architecture into a real plan.
A quiet question for leadership
If your organisation spends millions on market data…
if that spend is fragmented across teams…
and if it’s rarely questioned because “people just need access”…
…then there is almost certainly value being left on the table.
And if this feels familiar but you’re not sure where to start, that uncertainty is usually the signal to look closer, and to speak to people who approach market data as a data problem, not just a commercial one.

Sean Russell
Managing Principal at Ortecha
Gabriele Dubosaite
Consultant at Ortecha