As risk management stress tests have become commonplace as a safeguard against a crisis in which a financial institution lacks the capital to meet its financial commitments, the Basel Committee on Banking Supervision released, at the end of 2017, both a consultative document on stress testing principles and a range-of-practices report on supervisory and bank stress testing. The range-of-practices report describes and compares supervisory and bank stress testing practices and highlights areas of evolution. It finds that, in recent years, both banks and authorities have made significant advances in stress testing methodologies and infrastructure. While many of these gains were borne of the obvious need for a rapid response to a potential financial crisis, the changes should be embraced more generally. The report draws primarily on the results of two surveys, completed respectively by Basel Committee member authorities and by banks (54 respondent banks across 24 countries, including 20 global systemically important banks).
While there is much to gain from these reports’ findings, they also highlight a key challenge for banks that deserves attention: a current lack “of systems needed to efficiently aggregate data from across the banking group for use in stress tests.” In other words, a modern financial institution generates massive amounts of data every day, but producing actionable intelligence from that information remains a challenge, mainly due to technology constraints. “Data is the new natural resource,” but it is an underutilized resource if you cannot access it efficiently and in a timely manner to produce the needed analyses and reporting. The fact that banks are struggling to aggregate data efficiently across the enterprise should come as no great surprise. Even though the BCBS 239 principles, which aim to strengthen banks’ risk data aggregation capabilities and internal risk reporting practices, became effective in January 2016, a supervisory report published by the BIS earlier this year concluded that “while some progress has been made, most G-SIBs have not fully implemented the principles and the level of compliance with the principles is unsatisfactory.”
It is clear that banks need to act to improve their capabilities for risk data aggregation across the enterprise. One example of a system capable of processing data at this volume is IBM® Algo® Aggregation, which was recently awarded a 2018 RegTech Award for best solution addressing an FRTB requirement. This enterprise environment provides a highly flexible analysis and reporting solution, designed to integrate quickly and efficiently with a bank’s existing infrastructure. With intelligent in-memory caching of relevant information, the solution supports interactive dashboards capable of drilling into the underlying data and analytics, and avoids the latency problems common to deploying reporting on an enterprise-wide scale. With such a system in place, banks can reap the full benefits of stress testing while satisfying regulators that progress is being made.
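To make the caching pattern described above concrete, here is a minimal, purely illustrative sketch of aggregating risk exposures across business units with an in-memory cache, so that repeated dashboard drill-downs avoid recomputing the same aggregates. All names and figures below are hypothetical and are not drawn from the IBM product; this is only a sketch of the general idea under those assumptions.

```python
from functools import lru_cache

# Hypothetical position-level exposures (in millions), keyed by
# (business_unit, desk). A real system would pull these from many
# source systems across the banking group.
POSITIONS = {
    ("retail", "mortgages"): [120.0, 80.0, 45.5],
    ("retail", "cards"): [30.0, 22.5],
    ("markets", "rates"): [500.0, -150.0],
    ("markets", "credit"): [210.0, 90.0],
}

@lru_cache(maxsize=None)
def unit_exposure(business_unit: str) -> float:
    """Aggregate exposure for one business unit. Cached in memory, so a
    dashboard drilling into the same unit repeatedly pays the cost once."""
    return sum(
        sum(values)
        for (unit, _desk), values in POSITIONS.items()
        if unit == business_unit
    )

def group_exposure() -> float:
    """Group-wide aggregate built from the cached per-unit figures."""
    units = {unit for unit, _desk in POSITIONS}
    return sum(unit_exposure(u) for u in sorted(units))

if __name__ == "__main__":
    print(group_exposure())  # 948.0 = 298.0 (retail) + 650.0 (markets)
```

The design choice being illustrated is simply that aggregation results are memoized close to the reporting layer rather than recomputed from source data on every query, which is one way to avoid the latency problems of enterprise-wide reporting.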
For an in-depth discussion of the challenges that banks face when addressing risk data aggregation with existing architectures and tools, and of how a different approach can help organizations consolidate risk data and analytics more effectively, we recommend the whitepaper published by IBM Watson Financial Services, “Can your risk engine keep pace?”