
Bridge-Building with Data Vendors – Part 6

As promised, here is the next installment in my data vendors series, this week focusing on why proactive change management is essential when working with your vendors.

It’s really all about staying ahead of shifts in regulatory requirements, emerging asset classes, and evolving data types (rather than scrambling to keep up).

Just like other players in the financial ecosystem (and in fact any firm that produces and sells goods), data providers are continuously refining their offerings. They’re addressing quality issues flagged by clients, enhancing datasets and delivery technologies, and rolling out tactical improvements and bug fixes.

They’re also developing new products (i.e. new datasets), as well as restructuring existing ones.

The pace of these changes varies: powerhouses like Bloomberg may update frequently, while niche vendors’ products tend to evolve more slowly. But regardless of frequency, change is part of keeping their products competitive.

At the same time, while clients want rapid improvements to the specific datasets they care about, they typically prefer everything else to stay the same. This puts data vendors in a delicate position: improving fast enough to add value without disrupting their customers.

That’s why it’s critical for data consumers to be ready. Here are three key ways to take a proactive approach:

1. Clear, Open Communication Channels
Establish reliable, dedicated lines of communication with your data providers (support desks, primary contacts, and escalation paths in both directions) to ensure you’re never caught off guard.

2. Understanding Change Cycles
Know how each data product is managed. What changes are planned? How are you notified? What other data products might be impacted?
This is especially vital when working with large vendors like LSEG or ICE, which oversee numerous feeds, each managed differently.

3. Comprehensive Information Exchange
Ensure agreements are in place to receive critical information like data dictionaries, sample files, or change documentation to support testing and integration.
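
To make that last point concrete, here’s a minimal sketch of the kind of check a data dictionary and sample file make possible before a vendor change goes live: comparing the sample feed’s header against the dictionary to catch added or dropped fields early. The file names, pipe delimiter, and column layout below are illustrative assumptions, not any specific vendor’s format.

```python
# Minimal sketch: diffing a vendor sample file against a data dictionary.
# File names, the pipe delimiter, and column layout are hypothetical --
# real vendor feeds and dictionaries vary widely in structure.
import csv

def load_dictionary(path):
    """Read a data dictionary CSV with 'field' and 'type' columns."""
    with open(path, newline="") as f:
        return {row["field"]: row["type"] for row in csv.DictReader(f)}

def check_sample(sample_path, dictionary):
    """Compare the sample file's header row against the dictionary's fields."""
    with open(sample_path, newline="") as f:
        header = next(csv.reader(f, delimiter="|"))
    unknown = [field for field in header if field not in dictionary]
    missing = [field for field in dictionary if field not in header]
    return unknown, missing

if __name__ == "__main__":
    spec = load_dictionary("data_dictionary.csv")    # hypothetical file name
    unknown, missing = check_sample("sample_feed.psv", spec)
    if unknown:
        print(f"Fields not in the dictionary (possible vendor change): {unknown}")
    if missing:
        print(f"Expected fields absent from the sample: {missing}")
```

Even a check this simple surfaces the most common integration break, a field quietly added or removed, before it ever reaches production.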

By building a flexible but standardized operating model, your organization can efficiently onboard new data products and providers (commercial and not-for-profit alike), applying a proven process rather than reinventing solutions each time. It’s a resilient, “we’ve seen it all” strategy for avoiding surprises and minimizing risk.

At GoldenSource, we’ve implemented a three-stage operational model built around this concept, and I’ll be sharing the details with you in next week’s installment.
