Asset managers have tackled pricing issues for years. But what they haven’t faced – until now – is a minefield of thinly traded markets and unprecedented regulatory scrutiny.
While the subject of pricing isn’t new, it has never been more explosive – and the old way of running processes is being put to the test.
Download the paper, Buy Side Price Validation: Tackling the New Pricing Challenges, to find out how the industry is tackling the issues.
Download the GoldenSource Buy Side Price Validation white paper
Large organisations commonly treat the need to manage their client data effectively not as a strategic imperative, but as a by-product of projects conceived to solve a specific challenge.
These projects will typically be based on one of a number of disparate drivers including:
The aim of the Markets in Financial Instruments Directive (MiFID) is simple: to allow financial services institutions in the EU to provide their services across borders, while increasing transparency and investor protection for retail and institutional customers. Its execution, however, is far more complicated, involving significant detailed requirements that will affect the organisation and conduct of business of investment firms.
In short, MiFID is designed to uphold the integrity and overall efficiency of the European financial system in a competitive global market. The extensive changes this emerging regulatory framework brings to the way data is gathered, managed, shared and retained will have a significant impact on both buy side and sell side firms. This paper sheds light on some of the data issues stirred up by the MiFID revolution and suggests four pragmatic ways of addressing those needs.
How the retail industry manages its supply chain could hold the key to tackling systemic risk and achieving transparency in the financial markets. Regulatory mandates call for the financial services industry to collaborate and rethink its own data supply chain, writes Stephen Engdahl, SVP Product Strategy, GoldenSource.
Financial services is an industry of information, recognized as one of the most technologically advanced sectors in the world. Yet despite this, critical processes often fall short of those in industries with a physical product, such as retail.
This paper looks at how lessons from other industries such as retail can transform the financial data supply chain.
The industry’s drive to understand and reduce systemic risk has created fresh challenges around instrument and entity data that put technology under a whole new light.
However, in a post-2008 crisis landscape dominated by regulatory reform, compliance is only part of the issue. If firms can address how they manage multiple data sets and deploy a truly enterprise-wide model, they can capitalize on the real opportunity – achieving a competitive advantage.
This paper discusses how, by combining instruments and entities on a single platform, firms are able to benefit from reduced operational risk, more efficient client service and on-boarding and greater business agility.
Institutions now recognize that good data management practices are essential to improving how they manage risk.
Now, more than ever, they are prepared to do something about it.
This paper explores why so many institutions are now focusing on data as it relates to risk, and what has changed.
This white paper discusses GoldenSource’s 360 EDM approach to data management. Get a 360-degree view of your business to not only increase profits by reducing operating costs and preventing expensive errors, but also to eliminate blind spots caused by inconsistent or incomplete data.
This paper discusses how a data warehouse fits into an overall enterprise data management (EDM) strategy.
It identifies three common business problems in the financial services industry and shows how a data warehouse can be used to solve them.
The Solvency II directive provides for an EU-wide, consistent, risk-based approach to determining capital requirements for insurance firms. When fully adopted – by January 2014 – the framework set forth through Solvency II will protect the insurance policy holder from risk of insurance company failure.
The requirements of Solvency II primarily apply to insurance firms domiciled in the European Union. However, the impact is broader. The new regulations also place an indirect burden on those asset management firms which serve insurers. Both insurance companies and their service providers must recognize how the key components of Solvency II will affect them, and must adopt a solid framework for achieving Solvency II compliance. Given the timeline for compliance, underlying infrastructure such as data management and data quality systems must be selected and implemented immediately.
This paper describes the key requirements of Solvency II, and assesses the data management impact for each. Best practices and recommendations for how firms can take a strategic approach to Solvency II are provided, to enable asset managers to turn compliance into a source of competitive advantage.
This paper describes the current difficulty in automating the processing of complex instruments. It presents an overview of how the interconnected data delivered by Enterprise Data Management can be used to increase automation and mitigate significant but often opaque risks, such as counterparty exposure.
To successfully manage the new generation of complex structured financial products, financial institutions need to be able to gain a clear view of all the facts associated with a product: instruments, customers, counterparties, trades and positions. They must have a full understanding of how these facts are linked. Enterprise Data Management (EDM) provides a rich global financial data model with a high degree of flexibility, enabling financial institutions to create and manage complex financial instruments and understand all the relationships they encapsulate.
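The kind of linkage described above can be pictured in a small sketch. The class names and fields below are illustrative assumptions only – they are not GoldenSource’s actual data model – but they show the principle: when instruments, counterparties, trades and positions cross-reference one another, a question such as “what is our total exposure to this entity?” becomes a single traversal of the linked data rather than a reconciliation across silos.

```python
from dataclasses import dataclass

# Hypothetical, simplified entities illustrating cross-referenced data.
@dataclass(frozen=True)
class Counterparty:
    name: str

@dataclass(frozen=True)
class Instrument:
    isin: str
    issuer: Counterparty  # link: the entity standing behind the instrument

@dataclass
class Trade:
    instrument: Instrument
    counterparty: Counterparty  # link: the entity traded with
    quantity: float
    price: float

def counterparty_exposure(trades, entity):
    """Sum the notional of every trade that touches one entity,
    whether as trading counterparty or as issuer of the instrument."""
    return sum(
        t.quantity * t.price
        for t in trades
        if t.counterparty == entity or t.instrument.issuer == entity
    )

bank_a = Counterparty("Bank A")
bond = Instrument("XS0000000001", issuer=bank_a)
trades = [
    Trade(bond, bank_a, 1_000, 99.5),                 # traded with Bank A
    Trade(bond, Counterparty("Bank B"), 500, 99.0),   # issued by Bank A
]
print(counterparty_exposure(trades, bank_a))
```

Without the issuer link on the instrument, the second trade would be missed – which is precisely the sort of hidden exposure an enterprise-wide model is meant to surface.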