GoldenSource Blog

Learning from Lehman: Your FRTB Implementation Needs to be Built on Bedrock

Despite the news that FRTB will be delayed beyond 2019, banks can ill afford to kick their FRTB implementation into the long grass. This is due to the sheer scale and complexity of the FRTB framework. Firms need to work out exactly how much capital will be needed to underpin their market risk.

Origins of FRTB

Around this time 10 years ago, a certain investment bank’s stock plummeted. The concern was that its short-term liabilities were far greater than its liquid assets. Lehman Brothers failed to address a crucial issue, which was how much capital it needed to hold.

Typically, banks need to review their capital for operational, credit and market risk. And a decade on from Lehman’s demise, it is assessing the latter which has triggered a new set of rules. These force banks to calculate exactly how much capital is needed to protect themselves from sharp price falls. These prescriptive and global measures are commonly known as the Fundamental Review of the Trading Book (FRTB).

Banks operating in less liquid markets face a tougher FRTB implementation challenge

The difficulty is assessing which of their diverse range of assets hold the most risk. This is particularly challenging for banks operating in traditionally less liquid and emerging markets. A prime example is a bank based in APAC. Here there is neither a single currency such as the euro, nor a default reserve currency like the U.S. dollar.

A common challenge a bank like this faces is trying to start with the FRTB calculations, such as value at risk (VaR), before shoe-horning in the vital information that determines each asset's value. It is the equivalent of building a house on sand instead of bedrock. Or to put it another way, taking the Lehman Brothers approach to tackling FRTB.


Having the right data for your FRTB implementation is more challenging than the calculations

The point so many firms struggle to grasp is that doing the calculations, from expected shortfall to risk-weighted sensitivities, is not the main issue. The real challenge is assembling the right information to underpin the calculations. Call it taking the high-end builder's approach to FRTB – laying the foundations. Easily said, but what does it look like and how can it be done?
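To illustrate why the arithmetic itself is rarely the bottleneck, here is a minimal sketch of an expected shortfall calculation over a vector of historical P&L scenarios. The 97.5% confidence level reflects the FRTB internal model standard; the scenario data and function shown here are purely illustrative – assembling clean, complete inputs is the real work.

```python
import numpy as np

def expected_shortfall(pnl_scenarios, confidence=0.975):
    """Average loss in the tail beyond the VaR threshold.

    pnl_scenarios: historical or simulated P&L values (losses negative).
    The 97.5% level matches the FRTB internal model approach.
    """
    losses = -np.asarray(pnl_scenarios)           # convert P&L to losses
    var_threshold = np.quantile(losses, confidence)
    tail = losses[losses >= var_threshold]        # worst (1 - confidence) of outcomes
    return tail.mean()

# Toy example: 1,000 daily P&L observations
rng = np.random.default_rng(42)
pnl = rng.normal(loc=0.0, scale=1_000_000, size=1000)
print(f"97.5% ES: {expected_shortfall(pnl):,.0f}")
```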

Most of the FRTB calculations require the marriage of market and risk data. Historically, banks have struggled to achieve this without introducing errors. This is often because the granularity at which these data types are held does not match up. And with over 10 years’ worth of market data requiring assessment under FRTB, many now face an unprecedented challenge.
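A rough sketch of that granularity problem, assuming hypothetical column names and a pandas-style join: risk sensitivities are often held at curve-and-tenor level while market quotes are held at instrument level, and the two must be brought to the same grain before they can be combined without silently dropping or duplicating positions.

```python
import pandas as pd

# Hypothetical extracts: risk sensitivities keyed by curve and tenor,
# market quotes keyed by instrument. Column names are illustrative only.
risk = pd.DataFrame({
    "curve": ["USD-LIBOR", "USD-LIBOR", "EUR-OIS"],
    "tenor": ["2Y", "10Y", "5Y"],
    "delta": [12_500.0, -8_200.0, 3_100.0],
})
market = pd.DataFrame({
    "instrument": ["USD-LIBOR-2Y", "USD-LIBOR-10Y", "EUR-OIS-5Y"],
    "quote_date": pd.to_datetime(["2018-09-14"] * 3),
    "rate": [0.0291, 0.0306, 0.0012],
})

# Bring both sides to the same granularity before joining; a mismatch here
# is exactly the kind of error that propagates into the FRTB numbers.
market[["curve", "tenor"]] = market["instrument"].str.rsplit("-", n=1, expand=True)
aligned = risk.merge(market, on=["curve", "tenor"], how="left", validate="one_to_one")

# Any unmatched rows are the data gaps that need fixing first.
print(aligned[aligned["rate"].isna()])
```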

Without collating this backlog of information, and without ingesting new data from the traditional market data vendors, banks will not be able to identify and address any non-modellable risk factors (NMRFs). It is these NMRFs that have the biggest effect on whether a desk can trade under the internal model approach, or whether it must operate under higher capital requirements, which could mean lower profitability. The greater the number of modellable risk factors, the more likely an internal model approach will receive regulatory approval. And the more desks running on the internal model approach, the smaller (relatively) the pool of capital that needs to be set aside.
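For context, the original Basel Committee text frames modellability as a risk factor eligibility test: broadly, at least 24 real price observations over the preceding 12 months, with no gap of more than one month between consecutive observations (the 2019 revision adds alternative criteria). The sketch below illustrates that kind of observation-count check; the thresholds and field names are assumptions, not a statement of the current rule set.

```python
from datetime import date, timedelta

def is_modellable(observation_dates, as_of, min_obs=24, max_gap_days=31):
    """Rough sketch of an FRTB-style risk factor eligibility test:
    at least `min_obs` real price observations in the trailing year,
    with no gap between consecutive observations longer than
    `max_gap_days`. Thresholds follow the original BCBS text; check
    the current rules before relying on them.
    """
    window_start = as_of - timedelta(days=365)
    obs = sorted(d for d in observation_dates if window_start <= d <= as_of)
    if len(obs) < min_obs:
        return False
    gaps = [(b - a).days for a, b in zip(obs, obs[1:])]
    return all(g <= max_gap_days for g in gaps)

# Risk factors failing the test fall into the NMRF capital add-on.
weekly_obs = [date(2018, 9, 14) - timedelta(weeks=w) for w in range(52)]
print(is_modellable(weekly_obs, as_of=date(2018, 9, 14)))   # True: 52 weekly observations
sparse_obs = weekly_obs[::4]
print(is_modellable(sparse_obs, as_of=date(2018, 9, 14)))   # False: too few observations
```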

Auditable calculations and results are essential

Also, it is essential that a bank’s calculations are fully auditable. For this it needs to be able to pull together and store market, position and risk data, plus calculation results. This includes intel such as the contributor of the market data, its sensitivity, and exactly when it was distributed. A bank, as a case in point, may well have a portfolio of different interest rate positions.

In this situation, a risk officer needs to fully understand the nature of each position. Banks can only be confident that they have accurate FRTB calculations if they understand the market and risk data points that help quantify the difference between positions. This is regardless of whether they are measured in credit spreads or basis and volatility points.
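As a minimal sketch of what that lineage might look like in practice, the record below carries the kind of audit fields a stored market data point would need so a calculation result can be traced back to its source. The structure and field names are illustrative assumptions, not a reference to any particular vendor schema.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass(frozen=True)
class AuditedMarketDataPoint:
    """One stored market observation with the lineage fields needed
    to reproduce and audit an FRTB calculation. Field names are
    illustrative only."""
    risk_factor: str          # e.g. "USD swap 10Y", "XYZ 5Y CDS spread"
    value: float
    unit: str                 # basis points, volatility points, credit spread...
    contributor: str          # which vendor or desk supplied the quote
    distributed_at: datetime  # when the data point was published
    captured_at: datetime     # when the bank ingested and stored it

point = AuditedMarketDataPoint(
    risk_factor="USD swap 10Y",
    value=30.6,
    unit="bp",
    contributor="VendorA",
    distributed_at=datetime(2018, 9, 14, 16, 30),
    captured_at=datetime(2018, 9, 14, 16, 32),
)
print(point)
```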

Would the Lehman saga have happened under FRTB?

Even all these years on, many still struggle to fully comprehend the Lehman saga. Would it have been possible, if FRTB had been enforced way back in the early 2000s, to ignore key information underpinning calculations for mortgage backed securities? Would FRTB have made life easier for those taking the decisions that ultimately, changed the world?

One thing’s for certain: banks must not build their FRTB solutions on sand rather than a bedrock of data. If they do, they’ll soon find, as Lehman did, that their assets may not be enough to cover their liabilities.

Contact us to learn more
