What is data mastering?
Data mastering is the process of establishing a normalized instance of a unique data point or data record so that it can be referenced, linked or merged meaningfully with other data. Data mastering is often preceded by the ingestion or loading of source data and often followed by the creation of gold copy, or golden record, mastered data sets. In many businesses, data mastering is a key operational element of master data management (MDM) and enterprise data management (EDM) disciplines, supported by purpose-built data management solutions and services.
What is a master data record?
A master data record is the consolidated, validated instance of data attributes about a specific item or object, such as a customer, product or financial security. Having a master data record, or gold copy, eliminates duplicate data and establishes standard data formats, which enables authoritative data to be used consistently throughout a business.
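The consolidation described above can be sketched in a few lines of code. This is a minimal, illustrative example only: the record fields, the normalization rules and the matching key (normalized name plus phone number) are hypothetical, not a description of any particular mastering product.

```python
# Minimal sketch: collapsing duplicate customer records into one
# "gold copy". Field names and matching rules are hypothetical.

def normalize(record):
    """Standardize formats so records can be compared reliably."""
    digits = "".join(ch for ch in record["phone"] if ch.isdigit())
    return {
        "name": record["name"].strip().upper(),
        "country": record["country"].strip().upper()[:2],
        "phone": digits[-10:],  # ignore country-code prefixes
    }

def master(records):
    """Merge records sharing a normalized (name, phone) key into a
    single master record, keeping the most complete values."""
    gold = {}
    for rec in map(normalize, records):
        key = (rec["name"], rec["phone"])
        merged = gold.setdefault(key, {})
        for field, value in rec.items():
            if value and not merged.get(field):
                merged[field] = value
    return list(gold.values())

sources = [
    {"name": "Acme Corp ", "country": "us", "phone": "+1 212-555-0100"},
    {"name": "ACME CORP", "country": "US", "phone": "(212) 555 0100"},
]
print(master(sources))  # one consolidated record instead of two
```

Real matching logic is far richer (fuzzy matching, survivorship rules, audit trails), but the principle is the same: standardize formats first, then merge duplicates into a single authoritative record.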
What are the types of master data?
Types of master data used by financial firms include securities, customers and counterparties, entities, corporate actions, products, funds/accounts, ESG, pricing, risk, positions and balances, and benchmarks and indices. Mastered data types are often referred to as reference or static data. The range of data domains that are mastered is determined by the lines of business in which a firm operates.
Why do I need data mastering?
Firms need data mastering to ensure smooth and efficient business processes. Mastered data is quality checked and is the most complete and trusted version of data in the business. This means that business decisions are made with as much information and insight as is available, and that business processes are more likely to run uninterrupted by missing or improperly formatted data. In financial services, this results in a superior customer experience, greater straight-through processing (STP) and fewer operational risk events.
How do you manage master data?
Firms manage master data by using tools that ingest data from multiple external and internal sources, then validate, clean, complete and augment it. Finally, the data is organized and stored for subsequent access by users and for distribution to business systems.
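The ingest, validate, clean, complete and store steps above can be sketched as a simple pipeline. The feeds, field names and rules below are invented for illustration; a production system would add quarantining, survivorship rules and full audit history.

```python
# Illustrative mastering pipeline: ingest -> validate -> clean ->
# complete -> store. Sources and fields are hypothetical.

REQUIRED = ("isin", "name")

def ingest(*sources):
    """Combine raw records from multiple feeds into one stream."""
    for source in sources:
        yield from source

def validate(record):
    """Reject records missing mandatory identifiers."""
    return all(record.get(f) for f in REQUIRED)

def clean(record):
    """Apply standard formats (trimmed, uppercased text values)."""
    return {k: v.strip().upper() if isinstance(v, str) else v
            for k, v in record.items()}

def complete(record, internal):
    """Fill gaps from internal reference data; feed values win."""
    return {**internal.get(record["isin"], {}), **record}

def run(sources, internal):
    store = {}
    for rec in ingest(*sources):
        if not validate(rec):
            continue  # a real system would quarantine, not drop
        rec = complete(clean(rec), internal)
        store[rec["isin"]] = rec  # keyed master store, no duplicates
    return store

feed_a = [{"isin": "us0378331005", "name": "Apple Inc"}]
feed_b = [{"isin": "US0378331005", "name": "APPLE INC", "currency": "usd"},
          {"isin": "", "name": "Unknown"}]  # fails validation
internal = {"US0378331005": {"exchange": "XNAS"}}
print(run([feed_a, feed_b], internal))
```

Running this yields a single master record for the security, enriched with the internal `exchange` attribute, while the invalid record is filtered out before it can pollute downstream systems.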
Data mastering results in high quality data. Each master data record is unique (i.e. there are no duplicates) and represents the most fit-for-purpose data available within the business. For more information on the characteristics of quality data, please see our blog, GoldenSource 101: Data Quality.
What should I look for in a data mastering solution?
A good data mastering solution natively understands the types of data you’re dealing with and the relationships between individual data attributes. This relies on a data model that is specific to the financial services sector. Further benefits are achieved if the data model integrates fully with data vendors’ feeds (and other external data sources), with upstream systems to incorporate internal data into master records, and with downstream systems to ease the distribution of data throughout the business. A documented data dictionary also ensures that all data attributes are commonly understood throughout the firm, which promotes consistent usage and reduces the need for reconciliation processes.