
What Has Your Operational Data Store Done for You Lately?

An operational data store (or ODS) holds and delivers the best available instance of a data element at any given moment. It is the operationally authoritative source of data.

As a near-real-time system, the operational data store makes gold copy data sets instantly available. As well as publishing data to downstream systems, it can drive standard operational and event-driven processes. And the enduring feature of a financial services operational data store that manages the full range of static reference data (e.g. securities, entities) and dynamic data (e.g. positions, balances, ratings, prices) is that it maintains relational links across domains, giving business users almost unlimited freedom to simplify and control complex processes using natively pre-related data elements.
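To make "natively pre-related data" concrete, here is a minimal sketch in Python (the record types and fields are invented for illustration, not GoldenSource's actual model) of how static and dynamic records can carry links across domains, so a consumer such as a valuation step needs no ad hoc joins:

```python
from dataclasses import dataclass

# Hypothetical, simplified domain records; a real ODS schema is far richer.
@dataclass
class Entity:
    entity_id: str
    name: str

@dataclass
class Security:
    security_id: str
    issuer: Entity          # static reference data, pre-linked to its issuer

@dataclass
class Price:
    security: Security      # dynamic data, pre-linked to its security
    value: float
    currency: str

@dataclass
class Position:
    account_id: str
    security: Security
    quantity: float

    def market_value(self, price: Price) -> float:
        # Because the links already exist, valuation needs no ad hoc joins.
        return self.quantity * price.value

issuer = Entity("E1", "Acme Corp")
bond = Security("S1", issuer)
close = Price(bond, 101.25, "USD")
pos = Position("ACC-9", bond, 10_000)
print(pos.market_value(close))  # 1012500.0
```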

Mastering and the Operational Data Store

Mastering data is the process of ensuring that data received from a system or source (e.g. a data vendor or custodian) is fit for purpose. Whether an operational data store must also be used to master data depends on the system landscape, data architecture, operating model and level of trust in the quality of data received from a firm's suppliers. Mastering typically involves validating, cleansing (or scrubbing) and normalizing data received from one or more sources. The capability of a single system to both master and store gold copy data is part of what makes it an enterprise data management (EDM) platform.
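As a rough illustration of those three steps, the sketch below (with hypothetical rules and field names) validates, cleanses and normalizes a single vendor record before it becomes a gold copy candidate:

```python
RAW = {"isin": " us0378331005 ", "price": "175.30", "ccy": "usd"}

def validate(rec: dict) -> dict:
    # Reject records missing mandatory fields -- a stand-in for real rule sets.
    for field in ("isin", "price", "ccy"):
        if not rec.get(field):
            raise ValueError(f"missing {field}")
    return rec

def cleanse(rec: dict) -> dict:
    # Strip stray whitespace introduced by the feed.
    return {k: v.strip() if isinstance(v, str) else v for k, v in rec.items()}

def normalize(rec: dict) -> dict:
    # Conform the record to the ODS's internal representation.
    return {"isin": rec["isin"].upper(),
            "price": float(rec["price"]),
            "currency": rec["ccy"].upper()}

gold_candidate = normalize(cleanse(validate(RAW)))
print(gold_candidate)  # {'isin': 'US0378331005', 'price': 175.3, 'currency': 'USD'}
```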

Flexibility is key to adding value when and where it is needed. If there is no faith in the completeness or accuracy of a data feed, or if multiple feeds are used, then mastering is required to construct or curate the gold copy, or multiple gold copies for different purposes. An operational data store that includes a data quality framework can also feed corrections back to master systems, serving those systems, improving data quality at source and adding value to the data.
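One common way to curate a gold copy from several feeds is a per-field source-precedence (survivorship) rule. The sketch below assumes two invented vendors and two fields; real precedence rules would be far richer:

```python
# Hypothetical per-field source precedence: the first source listed wins
# when it supplies a value; later sources fill the gaps.
PRECEDENCE = {"price": ["vendor_a", "vendor_b"],
              "rating": ["vendor_b", "vendor_a"]}

feeds = {
    "vendor_a": {"price": 101.25, "rating": None},
    "vendor_b": {"price": 101.30, "rating": "AA"},
}

def gold_copy(feeds: dict, precedence: dict) -> dict:
    gold = {}
    for field, sources in precedence.items():
        for source in sources:
            value = feeds.get(source, {}).get(field)
            if value is not None:
                gold[field] = value
                break
    return gold

print(gold_copy(feeds, PRECEDENCE))  # {'price': 101.25, 'rating': 'AA'}
```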

Some asset managers might trust the single pricing feed they receive, in which case no data mastering is required and the ODS simply ingests, relates and makes the data available as the gold copy. By contrast, a bank that still works with multiple data mastering systems might bring all its reference data together in an operational data store, where it serves as a hub to the masters; it may even become the central system in which amendments, updates and corrections are made before being communicated back to the relevant masters. A brokerage might take a hybrid approach to managing sub-ledgers, in which the operational data store is the master for positions but serves other mastering systems for data related to P&L and risk.

Data Warehouses, Data Lakes, and Operational Data Stores

In contrast to an operational data store, a data warehouse typically populates on a batch basis once or twice a day, or in some cases less often than that. The data warehouse is one of many downstream systems that will be fed by an operational data store. A data warehouse will typically enrich and process the data it receives. It has a flatter data schema, which lends itself to reporting and driving management information.

Data lakes often contain unstructured data and lend themselves to speed of consumption over governance of process. To ingest data from a data lake, an operational data store will often use a staging area, which serves two purposes. First, it applies essential validation rules so the data can be queried immediately for fast analytics. Second, it organizes the data so it can be meaningfully ingested into the ODS, mastered where necessary and linked to related data types. A staging area can often serve the purpose of a data lake, up to the point at which the complexity and variety of unstructured data, and the processes required to make it meaningful for business purposes, demand a dedicated data lake.
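Here is a minimal sketch of that two-purpose staging step, with invented checks and type names: records pass essential validation so they can be queried at once, and are organized by data type so later ingestion into the ODS can master and link them.

```python
from collections import defaultdict

def stage(raw_records: list[dict]) -> dict[str, list[dict]]:
    """Apply essential checks and organize records by data type."""
    staged = defaultdict(list)
    for rec in raw_records:
        # Essential validation only -- full mastering happens later, in the ODS.
        if "id" not in rec or "type" not in rec:
            staged["rejected"].append(rec)
            continue
        staged[rec["type"]].append(rec)   # e.g. 'security', 'price', 'entity'
    return staged

staged = stage([{"id": "S1", "type": "security"},
                {"id": "S1", "type": "price", "value": 101.25},
                {"type": "price"}])           # fails essential validation
print(list(staged))  # ['security', 'price', 'rejected']
```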

Aside from managing unstructured data, a staging area also gives firms the option to be selective about which parts of a data feed to take into the operational data store. For example, the latest data for an entire universe of securities can arrive in the staging area, but the full validation and mastering process needed to ingest data into the operational data store need only be applied to securities of interest (SoI). This makes the mastering process, including exception management, far more efficient.
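As a sketch of that selectivity (the universe and the SoI set are invented), only records whose identifiers appear in the securities-of-interest set proceed to full validation and mastering:

```python
# Hypothetical securities of interest; everything else stays in staging.
SOI = {"US0378331005", "GB0002634946"}

def select_for_mastering(staged_universe: list[dict]) -> list[dict]:
    # Full validation, mastering and exception management apply only to these.
    return [rec for rec in staged_universe if rec.get("isin") in SOI]

universe = [{"isin": "US0378331005"}, {"isin": "DE0005557508"}]
print(select_for_mastering(universe))  # [{'isin': 'US0378331005'}]
```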

Minding the Data Gaps

Even well-known and widely used investment management, trading, risk and treasury platforms can have gaps in the data they cover, leaving an operational data store as the ideal way to fill them. Such capability gaps can be compounded by having to link a front-office system with a different middle-office system, not to mention separate accounting systems and those of third-party outsourced services.

In addition, a firm may trust its investment management system to handle security-level data, but perhaps not pricing data. In an ODS, the validation rules for a particular data set can be switched on or off depending on the level of confidence in its source. Again, this enables efficiency while maintaining standards across the business.
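A toy illustration of switching validation on or off per data set; the configuration keys and the single price rule are invented:

```python
# Hypothetical per-data-set switches reflecting confidence in each source.
VALIDATION_CONFIG = {"security_master": False,  # trusted -> rules off
                     "pricing": True}           # less trusted -> rules on

def ingest(data_set: str, record: dict) -> dict:
    if VALIDATION_CONFIG.get(data_set, True):   # default to validating
        if record.get("price", 0) <= 0:
            raise ValueError("price must be positive")
    return record

ingest("security_master", {"isin": "US0378331005"})  # passes untouched
ingest("pricing", {"isin": "US0378331005", "price": 175.30})
```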

Back to Flexibility

A central operational data store provides a stable core, which allows a firm to change data or service suppliers, or even business models. An ODS is inherently designed to ingest a variety of data feeds, so a switch in supplier or source system can be managed far more easily than if new data sources had to be fed directly into, for example, trading or investment platforms. Whatever the change of source, the data output from the operational data store stays constant.
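That insulation is essentially an adapter layer. In the sketch below (both vendor formats are invented), either feed can be swapped in while downstream consumers always receive the same shape:

```python
# Hypothetical vendor formats; each adapter maps its feed to the ODS schema.
def adapt_vendor_a(raw: dict) -> dict:
    return {"isin": raw["ISIN"], "price": raw["PX_LAST"]}

def adapt_vendor_b(raw: dict) -> dict:
    return {"isin": raw["id"]["isin"], "price": raw["close"]}

ADAPTERS = {"vendor_a": adapt_vendor_a, "vendor_b": adapt_vendor_b}

def publish(source: str, raw: dict) -> dict:
    # Downstream systems always receive this shape, whatever the source.
    return ADAPTERS[source](raw)

print(publish("vendor_a", {"ISIN": "US0378331005", "PX_LAST": 175.3}))
print(publish("vendor_b", {"id": {"isin": "US0378331005"}, "close": 175.4}))
```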

By closing data coverage gaps and offering the flexibility and agility to run business operations and migrate to new business models, operational data stores are a smart choice for upgrading a firm's operational resilience and capabilities.
