Asset managers have tackled pricing issues for years. But what they haven’t faced – until now – is a minefield of thinly-traded markets and unprecedented regulatory scrutiny. While the subject of pricing isn't new, it has never been more explosive – and the old way of running processes is being put to the test.
Download the paper, Buy Side Price Validation: Tackling the New Pricing Challenges, to find out how the industry is tackling the issues.
Large organisations commonly treat client data management not as a strategic imperative, but as a by-product of projects conceived to solve a specific challenge. These projects are typically driven by one of several disparate factors: CRM initiatives to identify new sales opportunities and drive revenue growth, regulatory imperatives such as the global banking reforms, technology drivers (e.g. system obsolescence), organisational restructuring, or internal process-optimisation targets.
How the retail industry manages its supply chain could hold the key to tackling systemic risk and achieving transparency in the financial markets. Regulatory mandates call for the financial services industry to collaborate and rethink its own data supply chain, writes Stephen Engdahl, SVP Product Strategy, GoldenSource.
The industry’s drive to understand and reduce systemic risk has created fresh challenges around instrument and entity data that cast technology in a whole new light. However, in a post-2008 crisis landscape dominated by regulatory reform, compliance is only part of the issue. If firms can address how they manage multiple data sets and deploy a truly enterprise-wide model, they can capitalise on the real opportunity: achieving a competitive advantage. This paper discusses how, by combining instruments and entities on a single platform, firms can benefit from reduced operational risk, more efficient client service and on-boarding, and greater business agility.
Institutions now recognize the importance of good data management practices in order to improve their risk practices, and more than ever before they are prepared to do something about it. This paper explores why so many institutions are now focusing on data as it relates to risk, and what has changed.
This whitepaper discusses GoldenSource's 360 EDM approach, which provides a 360-degree view of your business – not only increasing profits by reducing operating costs and preventing expensive errors, but also eliminating blind spots caused by inconsistent or incomplete data.
Written By Tom Stock, SVP of Product Management for GoldenSource
This paper discusses how a data warehouse fits into an overall enterprise data management (EDM) strategy. It discusses three common business problems in the financial services industry and how a data warehouse can be used to solve these typical industry issues.
The Solvency II directive provides for an EU-wide, consistent, risk-based approach to determining capital requirements for insurance firms. When fully adopted – by January 2014 – the framework set forth through Solvency II will protect the insurance policy holder from the risk of insurance company failure.
This paper describes the current difficulty in automating the processing of complex instruments and presents an overview of how the interconnected data delivered by Enterprise Data Management can be used to increase automation and mitigate significant hidden risks, such as counterparty exposure.