A little over two months ago, the ECB published its findings from the targeted review of internal models (TRIM). The message is difficult to ignore: banks that formed part of the TRIM exercise largely underestimated their risks, and thus severely underestimated their capital requirements. This resulted in an overall increase of 275 billion euros in aggregated RWAs for the financial institutions (FIs) that were part of the exercise. At the root of this underestimation: poor internal model data management and overall deficiencies in model management itself.
In the wake of the TRIM results, Basel IV presents itself on the regulatory horizon, bringing possibly the biggest change for IRB banks since Basel II. Not only does Basel IV restrict the use of internal models for certain exposure classes, it also introduces, for the first time, a floor to the IRB models based on a revised standardized approach. Looking at the scale of the problem that TRIM brought to the surface, it is not surprising that the ECB has taken a stronger interest in the internal models used at FIs, an interest that does not seem likely to go away in the years to come. On the positive side, the recommendations that flow from TRIM can serve as a head start for adopting the new Basel IV regulation.
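To make that floor concrete: in the finalized framework, reported RWAs may not drop below 72.5% of the RWAs calculated under the revised standardized approach once the floor is fully phased in. A minimal sketch of the mechanics, with purely illustrative figures:

```python
def floored_rwa(rwa_irb: float, rwa_sa: float,
                floor_factor: float = 0.725) -> float:
    """Basel IV output floor: reported RWAs may not fall below
    floor_factor times the RWAs under the revised standardized
    approach (72.5% once fully phased in)."""
    return max(rwa_irb, floor_factor * rwa_sa)

# Purely illustrative figures, in EUR billions
rwa_internal = 100.0      # RWAs from the bank's IRB models
rwa_standardized = 160.0  # RWAs under the revised standardized approach
print(floored_rwa(rwa_internal, rwa_standardized))  # 116.0: the floor binds
```

In this illustration the floor binds and caps the capital benefit of the IRB models, which is exactly where poor model quality becomes expensive.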
Going one step further, what does all this mean for FIs and their IT organization? How and where should they invest to ensure compliance with regulatory requirements on internal models? What are the key features that regulatory applications, legacy or new, will need in order to be future-proof in this new era? It all comes down to the following four:
1. Consistent data management & governance
Risk data is often collected from various source systems using archaic data integration tools that offer little in the way of data management capabilities to keep up with regulatory and internal demands for transparency and governance. A fully UI-driven ETL that gives the user a complete overview of the data flow from source to report, and that allows risk users themselves to work with the data, has become a basic necessity, yet it is perceived as lacking more often than not. Data governance has long been a slogan; in an ever-changing world it has become a reality.
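As an illustration of what "overview from source to report" can mean in practice, here is a minimal, hypothetical sketch of lineage recording around ETL steps; the step names and structure are invented for illustration, not a reference to any specific tool:

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Lineage:
    """Trail of every transformation a dataset passes through,
    from source system to final report."""
    steps: List[str] = field(default_factory=list)

def traced(lineage: Lineage, step_name: str):
    """Decorator that records each ETL step in the lineage trail."""
    def wrap(fn: Callable):
        def inner(data):
            lineage.steps.append(step_name)
            return fn(data)
        return inner
    return wrap

lineage = Lineage()

@traced(lineage, "load_exposures_from_core_banking")
def load_exposures(rows):
    return rows

@traced(lineage, "enrich_with_collateral_values")
def enrich(rows):
    return rows

report_input = enrich(load_exposures([]))
print(" -> ".join(lineage.steps))
# load_exposures_from_core_banking -> enrich_with_collateral_values
```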
2. Strong model management features
Designing and calibrating models is an art, performed by gifted modellers in financial institutions. The data analytics tools they use are often very powerful, but they are what they are, i.e. tools. All too often, huge effort and money are invested in constructing the models, with the model execution process being merely an afterthought: the model is conceived as an executable script that is never transformed into a usable (model) solution. This has resulted in an evident lack of model management capabilities, which becomes ever more pressing in turbulent times. As with data management, the risk user needs to be empowered to work with, change, version and publish models, all governed by an overarching governance process. Those are the necessary ingredients to support both the immediate regulatory demands and the future real-life demands of risk departments.
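What "version, publish and govern" could look like at its simplest is sketched below; the statuses and fields are assumptions for illustration, not a reference design:

```python
from dataclasses import dataclass
from datetime import date
from enum import Enum

class ModelStatus(Enum):
    DRAFT = "draft"
    IN_VALIDATION = "in_validation"
    APPROVED = "approved"
    RETIRED = "retired"

@dataclass(frozen=True)
class ModelVersion:
    """One immutable entry in a central model registry."""
    model_id: str          # e.g. "PD_RETAIL_MORTGAGE" (invented name)
    version: str           # version of the calibration, e.g. "2.1.0"
    status: ModelStatus
    approved_by: str       # independent validator, never the developer
    effective_from: date

def publishable(mv: ModelVersion) -> bool:
    """Governance gate: only approved versions may be executed."""
    return mv.status is ModelStatus.APPROVED

registry = [
    ModelVersion("PD_RETAIL_MORTGAGE", "2.1.0",
                 ModelStatus.APPROVED, "validation_team", date(2021, 7, 1)),
    ModelVersion("PD_RETAIL_MORTGAGE", "2.2.0",
                 ModelStatus.IN_VALIDATION, "", date(2021, 10, 1)),
]
print([mv.version for mv in registry if publishable(mv)])  # ['2.1.0']
```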
3. A flexible framework for model monitoring & model validation
Model monitoring combines the key statistics of the model after each model execution and is often conducted by either the model development team or the risk users. Whoever monitors the models should have a complete understanding of the model's performance in terms of the overall distribution of outcomes, the impact of data deficiencies (model variables) and the impact on the outcomes (risk factors). Model monitoring also requires the capability to drill through to the individual input parameters at counterparty or account level to check data completeness and accuracy. Internal model validation is then the independent challenge of the underlying model methodologies and a review of the analysis of outcomes. Model validation is often conducted as a periodic (typically yearly) and mostly manual exercise by a validation team that operates independently from the model development team.
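As an example, tracking discriminatory power after each run, the kind of statistic TRIM findings revolve around, can be as light as computing the AUC/Gini of the scoring function. A minimal sketch with synthetic data (scikit-learn assumed available):

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)

# Synthetic monitoring snapshot: 1 = defaulted, 0 = performing;
# "scores" stands in for the output of the PD scoring function.
defaults = rng.integers(0, 2, size=1_000)
scores = 0.3 * defaults + rng.normal(0.5, 0.25, size=1_000)

auc = roc_auc_score(defaults, scores)
gini = 2 * auc - 1  # the Gini coefficient usually reported to management
print(f"AUC = {auc:.3f}, Gini = {gini:.3f}")
# A Gini drifting toward zero after each run is the low
# discriminatory power that triggers a model review.
```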
Model validation, though, should ideally be an ongoing process rather than a yearly exercise, one that is well embedded in the risk department's operations. On top of the data management and model monitoring features described above, internal model validation requires a sandbox approach to run, benchmark and backtest models. TRIM identifies model validation as an area where progress is being made, but one that remains a challenge for financial institutions.
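To make the sandbox idea concrete, a grade-level PD backtest can be as simple as a one-sided binomial test per rating grade. A minimal sketch with invented figures (uses scipy's binomtest, available from scipy 1.7):

```python
from scipy.stats import binomtest  # scipy >= 1.7

# Hypothetical backtesting data per rating grade:
# grade -> (calibrated PD, number of obligors, observed defaults)
grades = {
    "A": (0.005, 4000, 26),
    "B": (0.020, 2500, 48),
    "C": (0.080, 900, 95),
}

for grade, (pd_est, n, k) in grades.items():
    # One-sided test: is the observed default count significantly
    # higher than the calibrated PD would allow?
    p_value = binomtest(k, n, pd_est, alternative="greater").pvalue
    verdict = "REVIEW" if p_value < 0.05 else "ok"
    print(f"grade {grade}: {k}/{n} defaults, p = {p_value:.4f} -> {verdict}")
```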
4. Performance
Flexibility will be key for applications: the ability to efficiently run and re-run different scenarios for the calibration of internal models. Not only will applications need to do this in a user-friendly way, they will also need to do it quickly. Performance is the name of the game, as it gives the FI a competitive advantage by allowing it to act and adapt more quickly.
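One pattern that helps: treat scenario runs as independent jobs and parallelize them. A minimal sketch, assuming scenarios share no state (the shock values and the calibration stub are invented):

```python
from concurrent.futures import ProcessPoolExecutor

def run_scenario(shock: float) -> float:
    """Placeholder for one full calibration run under a given
    macroeconomic shock; returns an illustrative stressed PD."""
    base_pd = 0.02
    return base_pd * (1.0 + shock)

if __name__ == "__main__":
    shocks = [0.0, 0.1, 0.25, 0.5, 1.0]  # hypothetical stress levels
    # Independent scenarios run in parallel across cores, so total
    # runtime approaches that of the slowest run rather than the sum.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(run_scenario, shocks))
    for shock, pd_est in zip(shocks, results):
        print(f"shock {shock:+.0%}: stressed PD {pd_est:.3%}")
```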
The outcome of TRIM is a real wake-up call for all FIs that use internal models. The sheer number of findings is undoubtedly a concern, but the nature of the findings is even more worrying. For example, in the most commonly used type of internal models, PD models for retail and SME portfolios, TRIM issued findings for more than half of the models concerning low risk differentiation owing to the low discriminatory power of the scoring function. That in itself means the modellers need to go back to the drawing board.
For low-default portfolios, Basel IV simply restricts or prohibits the use of internal models, after regulators realized that these models performed poorly in stress tests and lacked consistency in their overall outcomes. At the same time, stress testing is becoming the new normal as a way for FIs to cope with the more rapidly changing nature of the world we live in (cf. COVID-19). So it is safe to assume that the regulatory focus on internal models will only increase.
FIs are thus faced with significant investments in order to counter the findings from TRIM and the overall focus on internal models. In the years to come, investments will need to be made in:
- Future-proofing IT infrastructure, with a focus on flexibility and performance and on supporting applications in scenario analysis and overall calculation performance. FIs should focus on model construction and calibration, while the actual calculations should be positioned in best-of-breed dedicated applications.
- Putting in place a centralized model management framework to support the above: providing a transparent view on model versions, approval cycles, …; supporting the stressing of current models and their subsequent adaptation and deployment, with full traceability and transparency.
- Focusing on clean and consistent data at the center of it all. Failure to do so will inevitably cause the standardized floor to bind, restricting the benefit of the internal models in which so much money and so many resources have been invested.
Both TRIM and Basel IV show us that the future will bring a shift not only in the way FIs make their calculations, but also in the tools they use and the surrounding processes. The new wave will be as much about IT change as about regulatory change.
Our new Basel IV eBook looks at how a dynamic risk management approach prepares banks better for the future.
The Basel IV revision now offers a renewed opportunity to re-align the Basel process with the bank's risk management practices, as the expectation is that things will not go back to a stable reporting cycle.
The content includes:
- Management vs Compliance
- How prepared are you?
- Technology for future risk and compliance
- Stress testing
- Customer Case Study