White Papers

The Proposal for Machine Learning Whitepaper

Data quality is about continuous improvement and is much more than a ‘fit-for-purpose’ check at a moment in time on a particular data set.

Although there has been extensive research on derivative pricing models, no study to date has systematically reviewed the risk factor inputs to these models. My PhD research seeks to address this gap.


Author Charlie Browne

Read the full Proposal for Machine Learning Project Whitepaper here

 

Integrity and ethics for researchers using big data and machine learning techniques in derivative pricing

The purpose of this essay is to critically review the issues of research integrity and ethics as they apply to data science techniques for derivative pricing models. I hope the content provides food for thought and a useful reference for data scientists and chief data officers who seek to ensure ethics and integrity in data mining, model construction, analytics, pricing and the recommendations made in research reports to clients.


Author Charlie Browne

Download the Integrity and ethics for researchers using big data and machine learning techniques in derivative pricing whitepaper


A Data-Centric View of Liquidity in Valuations and Market Risk

Author Charlie Browne

This paper addresses data considerations for daily market data, real-price observations, time-series data, and risk factors. It includes a proposal for a centralized approach to IPV, Valuations, Daily P&L, FRTB IMA and the Risk Factor Eligibility Test. Data sourcing and alignment account for roughly 80% of the work in all of these areas.

The interaction between Valuations teams (often sitting in Finance and including both IPV and PRUVAL teams) and Market Risk teams (which roll up to the Risk function) is of crucial importance, both for FRTB and for the efficient operation of the bank’s trading function, which sits in and is run from the Front Office.

The design of relationships between the data stores that exist across Valuations and Market Risk teams will be central to the success of the coordinated approach that regulations such as FRTB demand. If the data stores underlying Finance and Market Risk are aligned and adhere to a robust set of data modelling and lineage principles, the calculations become straightforward.
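To make the data dependency concrete, consider the Risk Factor Eligibility Test (RFET) mentioned above. A commonly cited formulation of the FRTB rule treats a risk factor as modellable if, over the preceding 12 months, it has at least 24 real-price observations with no 90-day period containing fewer than 4 of them, or at least 100 observations in total. The Python sketch below is a simplified illustration of that rule, not a regulatory-grade implementation; the data layout is an assumption.

```python
from datetime import date, timedelta

def passes_rfet(observation_dates: list[date], as_of: date) -> bool:
    """Simplified FRTB Risk Factor Eligibility Test (illustrative only).

    A risk factor is treated as modellable if, over the 12 months up to
    `as_of`, it has (a) at least 24 real-price observations with no
    90-day window containing fewer than 4 of them, or (b) at least 100
    observations in total.
    """
    window_start = as_of - timedelta(days=365)
    obs = sorted(d for d in set(observation_dates) if window_start <= d <= as_of)

    if len(obs) >= 100:          # criterion (b): 100 observations suffice
        return True
    if len(obs) < 24:            # fewer than 24 fails both criteria
        return False

    # Criterion (a): every 90-day window in the year must contain at
    # least 4 observations; slide the window one day at a time.
    day = window_start
    while day + timedelta(days=90) <= as_of:
        if sum(1 for d in obs if day <= d < day + timedelta(days=90)) < 4:
            return False
        day += timedelta(days=1)
    return True
```

If the Valuations and Market Risk data stores disagree on which prices count as real-price observations, the same risk factor can pass the test in one system and fail it in the other, which is precisely the alignment problem described above.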

Download the Data-Centric View of Liquidity in Valuations and Market Risk whitepaper

Leveraging Data for Buy Side Risk Analytics


Author Charlie Browne

Purchasing analytics directly from vendors has the benefit of being a very simple operating model, but this approach offers neither flexibility nor transparency into the underlying calculations.

With high portfolio returns becoming increasingly difficult to generate, firms are choosing to refocus their efforts by adopting technology that offers better insights into their product portfolios and profitable investment strategies. A key aspect of this is the ability to leverage Enterprise Data Management (EDM) approaches that will drive the business forward over the long term.

By being better able to analyze time-series of prices, risk factors and portfolio static data, firms can carve out an edge in their portfolio optimization approaches and strategies, such as smart beta and factor returns analysis.
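As a toy illustration of the factor-returns analysis mentioned above, the sketch below estimates a portfolio’s factor exposures by regressing its return series on a set of factor return series using ordinary least squares. The factors and data are simulated placeholders, not anything from the paper.

```python
import numpy as np

def factor_exposures(portfolio_returns: np.ndarray,
                     factor_returns: np.ndarray) -> tuple[np.ndarray, float]:
    """Estimate factor betas and alpha by ordinary least squares.

    portfolio_returns: shape (T,) time series of portfolio returns.
    factor_returns:    shape (T, K), one column per factor
                       (e.g. market, value, momentum).
    Returns (betas of shape (K,), alpha).
    """
    T = factor_returns.shape[0]
    X = np.column_stack([np.ones(T), factor_returns])   # prepend intercept
    coef, *_ = np.linalg.lstsq(X, portfolio_returns, rcond=None)
    return coef[1:], float(coef[0])

# Illustrative usage with simulated daily data: three hypothetical factors.
rng = np.random.default_rng(0)
factors = rng.normal(0.0, 0.01, size=(250, 3))
portfolio = factors @ np.array([1.0, 0.4, -0.2]) + rng.normal(0.0, 0.002, 250)
betas, alpha = factor_exposures(portfolio, factors)
print(betas, alpha)   # betas recovered close to (1.0, 0.4, -0.2)
```

The same pattern underpins smart-beta attribution: clean, aligned time-series of portfolio and factor returns are the prerequisite, which is where the EDM layer earns its keep.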

Download the Leveraging Data for Buy Side Risk Analytics Whitepaper

A Data-Centric View of Liquidity


Author Charlie Browne

Market liquidity is a market’s capacity to absorb the purchase or sale of an asset without causing a material change in the asset’s price. It describes how quickly an asset can be sold without having to reduce its price.

Liquidity is about how steep the trade-off is between the speed of a sale and the price that can be achieved. In a liquid market the trade-off is mild: selling quickly will not reduce the price much. In a relatively illiquid market, a quick sale requires a meaningful price concession.
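A toy cost model makes the trade-off concrete. One simple, commonly used assumption is that execution cost per unit grows with the participation rate, i.e. the fraction of average daily volume (ADV) traded each day, so selling faster costs more. The parameters below are purely illustrative assumptions.

```python
def liquidation_cost(quantity: float, adv: float, days: int,
                     half_spread: float = 0.0005,
                     impact_coeff: float = 0.1) -> float:
    """Estimated proportional cost of selling `quantity` over `days` days.

    Assumes a linear price impact: each day's impact is proportional to
    that day's participation rate (daily quantity / ADV). All values are
    fractions of price; the coefficients are illustrative.
    """
    participation = (quantity / days) / adv
    return half_spread + impact_coeff * participation

# Selling 5% of ADV at once vs. spreading the sale over five days:
print(liquidation_cost(50_000, 1_000_000, days=1))   # 0.0055 -> ~55 bps
print(liquidation_cost(50_000, 1_000_000, days=5))   # 0.0015 -> ~15 bps
```

In a liquid market the ADV is large, the participation rate is small, and the cost of speed is mild; in an illiquid one the same trade forces a much larger concession.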

Download the GoldenSource Data Centric View of Liquidity Whitepaper


Buy Side Price Validation: Tackling the New Pricing Challenges


Author Charlie Browne

Asset Managers have tackled pricing issues for years. But what they haven’t faced – until now – is a minefield of thinly traded markets and unprecedented regulatory scrutiny.

While the subject of pricing isn’t new, it has never been more explosive – and the old way of running processes is being put to the test.

Download the paper, Buy Side Price Validation: Tackling the New Pricing Challenges, to find out how the industry is tackling the issues.

Download the GoldenSource Buy Side Price Validation Whitepaper


Complex Client Structures and Fragmented Systems Landscapes

Large organisations commonly treat the effective management of their client data not as a strategic imperative, but rather as a function of projects conceived to solve a specific challenge.

These projects are typically based on one of a number of disparate drivers, including:

  • CRM initiatives to identify new sales opportunities and drive revenue growth
  • Regulatory imperatives such as the global banking reforms
  • Technology drivers (e.g. system obsolescence)
  • Organisational restructuring or internal process-optimisation targets

Download the Client Structures White Paper


MiFID: How to Deal with 4 Critical Reference Data Issues

The aim of the Markets in Financial Instruments Directive (MiFID) is simple: to allow financial services institutions in the EU to provide their services across borders, while increasing transparency and investor protection for retail and institutional customers. Its execution, however, is far more complicated, involving significant and detailed requirements that affect:

  • Organisation and conduct of business of investment firms.
  • Operation of regulated markets.
  • New pre- and post-trade transparency requirements for equity markets.
  • Creation of a new regime for ‘systematic internalisers’ of retail order flow in liquid equities.
  • More extensive transaction reporting requirements.

In short, MiFID is designed to uphold the integrity and overall efficiency of the European financial system in a competitive global market. As a result of the emerging regulatory framework, the extensive changes surrounding the way data is gathered, managed, shared and retained will have a significant impact on both buy side and sell side firms. This paper sheds light on some of the data issues stirred up by the MiFID revolution and suggests four pragmatic ways of addressing them.

Download the MiFID II White Paper


Goods in transit: How Lessons from Retail can Transform the Financial Data Supply Chain

How the retail industry manages its supply chain could hold the key to tackling systemic risk and achieving transparency in the financial markets. Regulatory mandates call for the financial services industry to collaborate and rethink its own data supply chain, writes Stephen Engdahl, SVP Product Strategy, GoldenSource. 

Financial services is an industry of information, recognized as one of the most technologically advanced sectors in the world. Despite this, critical processes often fall short of those in industries with a physical product, such as retail.

This paper looks at how lessons from other industries such as retail can transform the financial data supply chain.

Download Lessons from Retail White Paper


 

Uniting Entity Data – The Missed Opportunity

The industry’s drive to understand and reduce systemic risk has created fresh challenges around instrument and entity data that put technology under a whole new light.

However, in a post-2008 crisis landscape dominated by regulatory reform, compliance is only part of the issue. If firms can address how they manage multiple data sets and deploy a truly enterprise-wide model, they can capitalize on the real opportunity – achieving a competitive advantage.

This paper discusses how, by combining instruments and entities on a single platform, firms can benefit from reduced operational risk, more efficient client service and onboarding, and greater business agility.

Data Management to Improve Risk Management for the Buy Side

Institutions now recognize the importance of good data management practices in improving how they manage risk.

Now, more than ever before, they are prepared to do something about it.

This paper explores why so many institutions are now focusing on data as it relates to risk, and what has changed.

Download the Data Management for Risk Management White Paper


GoldenSource 360 EDM - Breakthroughs in Data Management: Getting from 0 to 360

This white paper discusses GoldenSource’s 360 EDM approach to data management. Get a 360-degree view of your business to not only increase profits by reducing operating costs and preventing expensive errors, but also to eliminate blind spots caused by inconsistent or incomplete data.

Download GoldenSource 360 EDM Breakthroughs in Data Management White Paper


Using a Data Warehouse to Solve Risk, Performance, Reporting and Compliance-related Issues

Written by Tom Stock, SVP of Product Management for GoldenSource

This paper discusses how a data warehouse fits into an overall enterprise data management (EDM) strategy.

It identifies three common business problems in the financial services industry and shows how a data warehouse can be used to solve them.

Success with Solvency II: Challenges and Opportunities for Asset Managers

The Solvency II directive provides for an EU-wide, consistent, risk-based approach to determining capital requirements for insurance firms. When fully adopted – by January 2014 – the framework set forth through Solvency II will protect the insurance policy holder from risk of insurance company failure.

The requirements of Solvency II primarily apply to insurance firms domiciled in the European Union. However, the impact is broader. The new regulations also place an indirect burden on those asset management firms which serve insurers. Both insurance companies and their service providers must recognize how the key components of Solvency II will affect them, and must adopt a solid framework for achieving Solvency II compliance. Given the timeline for compliance, underlying infrastructure such as data management and data quality systems must be selected and implemented immediately.

This paper describes the key requirements of Solvency II, and assesses the data management impact for each. Best practices and recommendations for how firms can take a strategic approach to Solvency II are provided, to enable asset managers to turn compliance into a source of competitive advantage.

Download the Solvency II Whitepaper

Structured Products - Using EDM to Manage Risk

This paper describes the current difficulty in automating the processing of complex instruments. It presents an overview of how the interconnected data delivered by Enterprise Data Management can be used to increase automation and mitigate the considerable risks that arise from unclear exposures, such as counterparty exposure.

To successfully manage the new generation of complex structured financial products, financial institutions need to be able to gain a clear view of all the facts associated with a product: instruments, customers, counterparties, trades and positions. They must have a full understanding of how these facts are linked. Enterprise Data Management (EDM) provides a rich global financial data model with a high degree of flexibility, enabling financial institutions to create and manage complex financial instruments and understand all the relationships they encapsulate.
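As a minimal sketch of the linkage the paper describes, the snippet below models trades that carry identifiers linking them to instrument and counterparty records, then rolls market values up to counterparty-level and issuer-level exposure. The schema and figures are illustrative assumptions, not the GoldenSource data model.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass(frozen=True)
class Instrument:
    instrument_id: str
    issuer_id: str          # link from instrument to its issuing entity

@dataclass(frozen=True)
class Trade:
    instrument_id: str      # link from trade to instrument
    counterparty_id: str    # link from trade to counterparty entity
    market_value: float

def exposures(trades: list[Trade],
              instruments: dict[str, Instrument]) -> tuple[dict, dict]:
    """Roll trades up to counterparty and issuer exposure in one pass."""
    by_counterparty: dict[str, float] = defaultdict(float)
    by_issuer: dict[str, float] = defaultdict(float)
    for t in trades:
        by_counterparty[t.counterparty_id] += t.market_value
        by_issuer[instruments[t.instrument_id].issuer_id] += t.market_value
    return dict(by_counterparty), dict(by_issuer)

# Illustrative usage: two instruments, two counterparties, three trades.
instruments = {
    "bond-123": Instrument("bond-123", "issuer-X"),
    "swap-456": Instrument("swap-456", "issuer-Y"),
}
trades = [
    Trade("bond-123", "cpty-A", 1_500_000.0),
    Trade("swap-456", "cpty-A", -400_000.0),
    Trade("bond-123", "cpty-B", 250_000.0),
]
print(exposures(trades, instruments))
# ({'cpty-A': 1100000.0, 'cpty-B': 250000.0},
#  {'issuer-X': 1750000.0, 'issuer-Y': -400000.0})
```

The point is structural: once instruments, counterparties, trades and positions share consistent identifiers, each view of exposure is a simple traversal of the same linked data rather than a separate reconciliation exercise.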