GoldenSource 101: Data Mastering

What is data mastering?

Data mastering is the process of establishing a normalized instance of a unique data point or data record so that it can be referenced, linked or merged meaningfully with other data. Data mastering is typically preceded by the ingestion or loading of source data, and followed by the creation of gold copy (golden record) master data sets. Data mastering can also involve merging an unmastered data record into an existing mastered one.
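To make that concrete, here is a minimal sketch in Python of the normalize-then-merge step described above. The canonical schema, the field names and the ISIN-based match key are hypothetical, chosen only to illustrate the idea; they do not represent any particular product's API.

```python
# Minimal sketch of normalize-then-merge data mastering.
# All field names below are hypothetical, for illustration only.

def normalize(raw: dict) -> dict:
    """Map a source record's fields onto a canonical schema."""
    return {
        "isin": raw.get("ISIN", "").strip().upper(),   # standard identifier
        "name": raw.get("SecurityName", "").strip(),
        "currency": raw.get("Ccy", "").strip().upper(),
    }

def merge(master: dict, incoming: dict) -> dict:
    """Merge an unmastered record into the master record,
    keeping existing attributes and filling only the gaps."""
    merged = dict(master)
    for field, value in incoming.items():
        if value and not merged.get(field):
            merged[field] = value
    return merged

# Two feeds describe the same security; normalization makes them comparable.
master = normalize({"ISIN": "us0378331005", "SecurityName": "Apple Inc"})
update = normalize({"ISIN": "US0378331005", "Ccy": "usd"})
if update["isin"] == master["isin"]:               # link on the shared identifier
    master = merge(master, update)
print(master)
# {'isin': 'US0378331005', 'name': 'Apple Inc', 'currency': 'USD'}
```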

In many businesses, data mastering is a key operational element of master data management (MDM) and enterprise data management (EDM) disciplines, supported by purpose-built data management solutions and services. Data mastering is easily confused with MDM, but the two are not the same: MDM is the broader discipline of governing master data across the business, while data mastering is the specific process within it that creates and maintains master records. Done well, data mastering is a way to get more value out of your data and to derive insights from it.

What is a master data record?

A master data record is the consolidated, validated instance of data attributes about a specific item or object, such as a customer, product or financial security. Having a master data record, or gold copy, eliminates duplicate data and establishes standard data formats, which enables authoritative data to be used consistently throughout a business.
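To show what eliminating duplicates looks like in practice, the sketch below consolidates duplicate candidate records for one security into a single gold copy. The source names, the priority ranking and the attribute names are illustrative assumptions; real-world survivorship rules are far richer.

```python
# Sketch: consolidate duplicate candidates into one gold copy record.
# Source names, priorities and fields are hypothetical examples.

SOURCE_PRIORITY = {"vendor_a": 0, "vendor_b": 1, "internal": 2}  # lower = preferred

def gold_copy(candidates: list[dict]) -> dict:
    """Keep, for each attribute, the first non-empty value from the
    highest-priority source (a simple survivorship rule)."""
    ranked = sorted(candidates, key=lambda r: SOURCE_PRIORITY[r["source"]])
    record = {}
    for candidate in ranked:                           # best source first
        for field, value in candidate.items():
            if field != "source" and value and field not in record:
                record[field] = value
    return record

duplicates = [
    {"source": "vendor_b", "isin": "US0378331005", "name": "APPLE ORD"},
    {"source": "vendor_a", "isin": "US0378331005", "name": "Apple Inc", "sector": ""},
    {"source": "internal", "isin": "US0378331005", "sector": "Technology"},
]
print(gold_copy(duplicates))
# {'isin': 'US0378331005', 'name': 'Apple Inc', 'sector': 'Technology'}
```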

What are the types of master data?

Types of master data used by financial firms include securities, customers and counterparties, entities, corporate actions, products, funds/accounts, ESG, pricing, risk, positions and balances, and benchmarks and indices. Mastered data types are often referred to as reference or static data. The range of data domains that are mastered is determined by the lines of business a firm operates in. In broader contexts, master data types include customer data, product data, financial data and transaction data.

Why do I need data mastering?

Firms need data mastering to ensure smooth and efficient business processes. Mastered data is quality checked and is the most complete and trusted version of data in the business. Mastered data also provides a central, complete view of everything happening in your firm. With that resource, business decisions can be made with the fullest information and insight available, and business processes are more likely to run uninterrupted by missing or improperly formatted data. That in turn increases efficiency, decreases the workload for data managers and increases trust in the data you have collected. In financial services, this translates into a superior customer experience, higher rates of straight-through processing (STP) and fewer operational risk events.

How do you manage master data?

Firms manage master data by using tools that ingest data from multiple external and internal sources, then validate, clean, complete and augment it. Finally, the data is organized and stored for subsequent access by users and for distribution of data sets to business systems. You can do all of this using MDM systems, but a system cannot do everything. Firms should set up an MDM process that will work, starting with identifying their data sources and who is going to consume the data on the other end. They must make sure the firm is ready to use MDM, then be prepared to test, update and maintain MDM systems. The data mastering process produces high-quality data: each master data record is unique (i.e. there are no duplicates) and represents the most fit-for-purpose data available within the business. For more information on the characteristics of quality data, please see our blog, GoldenSource 101: Data Quality.
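The sketch below chains those stages (clean, validate, complete, augment) end to end. The stage names follow the paragraph above; the rules inside each stage are placeholder assumptions, not a description of any specific MDM system.

```python
# Sketch of an MDM-style mastering pipeline. Stage names follow the
# text above; the rules inside each stage are hypothetical placeholders.

REQUIRED = ("isin", "name")

def clean(record: dict) -> dict:
    """Strip stray whitespace from every string attribute."""
    return {k: v.strip() if isinstance(v, str) else v for k, v in record.items()}

def validate(record: dict) -> dict:
    """Reject records that are missing mandatory attributes."""
    missing = [f for f in REQUIRED if not record.get(f)]
    if missing:
        raise ValueError(f"record failed validation, missing: {missing}")
    return record

def complete(record: dict, defaults: dict) -> dict:
    """Fill empty attributes from agreed default values."""
    return {**defaults, **{k: v for k, v in record.items() if v}}

def augment(record: dict, reference: dict) -> dict:
    """Enrich the record from an internal reference lookup."""
    return {**record, **reference.get(record["isin"], {})}

def master_pipeline(raw: dict) -> dict:
    record = clean(raw)
    record = validate(record)
    record = complete(record, defaults={"currency": "USD"})
    record = augment(record, reference={"US0378331005": {"sector": "Technology"}})
    return record

print(master_pipeline({"isin": " US0378331005 ", "name": "Apple Inc", "currency": ""}))
# {'currency': 'USD', 'isin': 'US0378331005', 'name': 'Apple Inc', 'sector': 'Technology'}
```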

What should I look for in a good data mastering solution?

A good data mastering solution natively understands the types of data you're dealing with and the relationships between individual data attributes. This relies on a data model that is specific to the financial services sector. Further benefits come from a data model that integrates fully with data vendors' feeds and other external data sources, with upstream systems so that internal data can be incorporated into master records, and with downstream systems so that mastered data is easily distributed throughout the business. A documented data dictionary also ensures that all data attributes are commonly understood throughout the firm, which promotes consistent usage and reduces the need for reconciliation processes.
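As a small illustration of the data dictionary idea, the sketch below stores one shared, documented definition per attribute that any team or system can consult. The two entries shown are hypothetical examples.

```python
# Sketch of a data dictionary: one shared, documented definition per
# attribute. The two entries shown are hypothetical examples.

DATA_DICTIONARY = {
    "isin": {
        "type": "str",
        "definition": "ISO 6166 International Securities Identification Number",
        "format": "12 alphanumeric characters",
    },
    "currency": {
        "type": "str",
        "definition": "Trading currency of the instrument",
        "format": "ISO 4217 three-letter code",
    },
}

def describe(attribute: str) -> str:
    """Return the shared, firm-wide description of an attribute."""
    entry = DATA_DICTIONARY[attribute]
    return f"{attribute}: {entry['definition']} ({entry['format']})"

print(describe("isin"))
# isin: ISO 6166 International Securities Identification Number (12 alphanumeric characters)
```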

Beyond these functional aspects, you should also look for data mastering that turns the process from a cost center into a value producer. Be mindful of what the firm's leaders want from a data mastering system and what your firm's performance goals are for its data, and choose a system that also serves those interests.
