Implications of Alternative Operational Risk Modeling Techniques

Using Loss Data to Quantify Operational Risk

SSRN Electronic Journal, 2003

Management and quantification of operational risk have been impeded by the lack of internal or external data on operational losses. We consider newly available data collected from public information sources, and show how such data can be used to quantify operational risk for large internationally active banks. We find that operational losses are an important source of risk for such banks, and that the capital charge for operational risk will often exceed the charge for market risk. Although operational risk capital will vary depending on the size and scope of a bank's activities, our results are consistent with the $2–7 billion in capital some large internationally active banks are currently allocating for operational risk.

A loss distribution for operational risk derived from pooled bank losses

The Basel II accord encourages banks to develop their own advanced measurement approaches (AMA). However, the paucity of loss data implies that an individual bank cannot obtain a probability distribution with any reliability. We propose a model, initially targeting the regulator, that obtains a probability distribution for loss magnitude from pooled annual risk losses across the banks under the regulator's oversight. We start with summarized loss data from 63 European banks and adjust the resulting probability distribution for losses that go unreported because they fall below the reporting threshold. Using our model, the regulator has a tool for understanding the extent of annual operational losses across all the banks under its supervision, and can use it on an ongoing basis to compare year-on-year changes in the operational risk profile of the regulated banking sector.

The Basel II accord lays out three possibilities for calculating the minimum capital reserve required to cover operational risk losses: the basic approach, the standardized approach, and the advanced measurement approach (AMA). The latter is specific to an individual bank, which uses its own approach to determine capital requirements for its different lines of business and for the bank as a whole. A typical AMA model uses a probability distribution for loss per incident of a certain category and another for the number of incidents in that category, although there are other modeling approaches as well. A problem with this approach is the paucity of loss data available to any particular bank for obtaining such distributions. We obtain a probability distribution for operational risk loss impact using summarized results of pooled operational risk losses from multiple banks. Doing so allows us to derive simple AMA models for regulators using data from the banks they oversee.
One possibility is that the regulator can obtain an estimate of the capital requirement for a 'typical' bank under its supervision. Using data from 63 banks, we find that the distribution fits annual losses very well. Moreover, we adjust for the fact that the regulator sees only losses above a certain threshold, say €10,000.
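The threshold adjustment described here can be sketched in code. The following is a minimal illustration, not the paper's actual model: it assumes a lognormal severity distribution and recovers its parameters from losses observed only above a €10,000 reporting threshold by maximising the left-truncated likelihood. All data and parameter values are simulated.

```python
import numpy as np
from scipy import stats, optimize

THRESHOLD = 10_000.0  # reporting threshold, e.g. EUR 10,000

# Simulate "true" losses, of which the regulator sees only exceedances
rng = np.random.default_rng(42)
true_mu, true_sigma = 9.0, 2.0
all_losses = rng.lognormal(true_mu, true_sigma, size=5_000)
observed = all_losses[all_losses > THRESHOLD]

def neg_log_lik(params):
    """Negative log-likelihood of a lognormal left-truncated at THRESHOLD."""
    mu, sigma = params
    if sigma <= 0:
        return np.inf
    # density conditional on exceeding the threshold: pdf / P(X > threshold)
    log_pdf = stats.lognorm.logpdf(observed, s=sigma, scale=np.exp(mu))
    log_sf = stats.lognorm.logsf(THRESHOLD, s=sigma, scale=np.exp(mu))
    return -(log_pdf - log_sf).sum()

res = optimize.minimize(neg_log_lik,
                        x0=[np.log(np.median(observed)), 1.0],
                        method="Nelder-Mead")
mu_hat, sigma_hat = res.x
print(f"fitted mu={mu_hat:.2f}, sigma={sigma_hat:.2f}")
```

Ignoring the truncation and fitting the observed losses directly would overstate the mean and understate the dispersion of the underlying loss distribution, which is why the survival-function correction matters.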

Measuring operational risk in financial institutions

Applied Financial Economics, 2012

The scarcity of internal loss databases tends to hinder the use of the advanced approaches for operational risk measurement (Advanced Measurement Approaches (AMA)) in financial institutions. As there is greater variety in credit risk modelling, this article explores the applicability of a modified version of CreditRisk+ to operational loss data. Our adapted model, OpRisk+, yields very satisfactory Value-at-Risk (VaR) estimates at the 95% level compared with estimates drawn from sophisticated AMA models. OpRisk+ proves especially worthy in the case of small samples, where more complex methods cannot be applied. OpRisk+ could therefore be used to fit the body of the distribution of operational losses up to the 95th percentile, while Extreme Value Theory (EVT), external databases or scenario analysis should be used beyond this quantile.
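OpRisk+ adapts the machinery of CreditRisk+, whose core is a Panjer-style recursion that builds a compound Poisson aggregate loss distribution over discretised loss bands. The sketch below illustrates that general mechanism with invented parameters; it is not the authors' calibration.

```python
import numpy as np

lam = 12.0        # expected number of loss events per year (illustrative)
unit = 10_000.0   # band width: losses are discretised into multiples of `unit`
# severity pmf over multiples of `unit` (f[k] = P(loss = k units))
f = np.array([0.0, 0.55, 0.25, 0.12, 0.05, 0.03])

# Panjer recursion for a compound Poisson distribution:
#   g[0] = exp(-lam * (1 - f[0]))
#   g[n] = (lam / n) * sum_{j=1}^{min(n, m)} j * f[j] * g[n - j]
n_max = 400  # truncation point for the aggregate distribution
g = np.zeros(n_max + 1)
g[0] = np.exp(-lam * (1.0 - f[0]))
for n in range(1, n_max + 1):
    j = np.arange(1, min(n, len(f) - 1) + 1)
    g[n] = (lam / n) * np.sum(j * f[j] * g[n - j])

# Read the 95% VaR off the aggregate distribution
cdf = np.cumsum(g)
var_95_units = int(np.searchsorted(cdf, 0.95))
print(f"95% VaR ~ {var_95_units * unit:,.0f}")
```

The recursion is exact (up to discretisation) and cheap, which is one reason this family of models is attractive when loss samples are too small for simulation-heavy AMA approaches.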

An Approach to Modelling Operational Risk in Banks

WORKING PAPER SERIES-HENLEY …, 1999

While much work has been done in recent years on developing models to measure Market and Credit risks in banks and securities firms, there have been fewer attempts to model other important risks, in particular Operational Risk. This may be because, although recognised as important, there is little agreement on what constitutes Operational Risk. There is some agreement, however, that Operational Risk arises from a "breakdown" in operational processes and that Internal Audit has a key role in identifying potential operational breakdowns. This paper draws on the well-established theories of Reliability developed in Operational Research to propose models for estimating the likelihood of failures/breakdowns occurring in operational processes and for estimating the losses that might result from such breakdowns. Using these models, the paper demonstrates how a Value at Risk may be computed for the set of processes that comprise the operations of a bank. The role of Internal Audit in calibrating and testing such models is also highlighted.
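The reliability idea can be made concrete with a small simulation: each operational process is assigned a breakdown probability over the period and a loss-given-breakdown distribution, and simulating the portfolio of processes yields a loss distribution from which a VaR is read off. All process parameters below are hypothetical, chosen only to illustrate the mechanism the paper describes.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical per-process annual breakdown probabilities and
# lognormal loss-given-breakdown parameters (log-scale mean / sd)
fail_prob = np.array([0.02, 0.05, 0.10, 0.01])
loss_mu = np.array([13.0, 12.0, 11.0, 14.0])
loss_sigma = np.array([1.0, 0.8, 1.2, 0.9])

n_sims = 100_000
# For each simulated year: which processes break down, and at what cost
failures = rng.random((n_sims, len(fail_prob))) < fail_prob
severities = rng.lognormal(loss_mu, loss_sigma, size=(n_sims, len(fail_prob)))
total_loss = (failures * severities).sum(axis=1)

var_99 = np.quantile(total_loss, 0.99)
print(f"99% operational VaR ~ {var_99:,.0f}")
```

In the paper's framing, Internal Audit findings would calibrate the breakdown probabilities; here they are simply asserted.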

The Determinants of Operational Risk in U.S. Financial Institutions

Journal of Financial and Quantitative Analysis, 2011

We examine the incidence of operational losses among U.S. financial institutions using publicly reported loss data from 1980 to 2005. We show that most operational losses can be traced to a breakdown of internal control, and that firms suffering from these losses tend to be younger and more complex, and have higher credit risk, more antitakeover provisions, and chief executive officers (CEOs) with higher stock option holdings and bonuses relative to salary. These findings highlight the correlation between operational risk and credit risk, as well as the role of corporate governance and proper managerial incentives in mitigating operational risk.

Towards a framework for operational risk management in the banking sector

PHD Thesis, 2022

The objective of this study is three-pronged. First, it investigates the factors that influence capital adequacy as measured by the covariates (exposure, frequency and severity) used in banking operations, drawn from firms' data-logged loss reports. Second, it assesses the differential impact of discretionary (adding artificial data) and non-discretionary (using real-world data) loss disclosure on firms' value-at-risk. R software is used to determine the value-at-risk: GLM and GAMLSS techniques are employed, and subsequent tests of significance identify the aforementioned influential factors, accompanied by a data-augmentation algorithm in Matlab to determine the differential impact of artificial and real-world operational loss disclosures on firms' performance in meeting capital requirements. Third, it challenges the risk-neutral assumption inherent in operational risk practice, asserting that, in theory, banking operations are more risk averse. Rattle software is used with k-means cluster analysis to determine whether controls compensate for persistent losses owing to firms' natural risk aversion. The research arrived at estimates of the number of losses and their sizes, whereby exposure positively influences the risk ceded by the bank having "learned" from possible variations in past data, thereby improving operational risk management frameworks through ex ante forward-looking components; the addition of artificial data points by data augmentation circumvents the long-standing problem of large and rare events so that more "learning" is possible, notwithstanding the suspect quality of such data, which are constructs rather than observations.
Nevertheless, the artificially intelligent EBOR framework improves on former techniques for calculating OpRisk capital adequacy (Hoohlo (2014)'s applied data-scaling and parametrisation techniques arrived at a proxy of about ZAR3B), opening up modelling beyond historical accounts to incorporate forward-looking aspects. Furthermore, checks and balances set up based on operational negligence slow down operational risk losses over time, moving firms' risk-tolerance levels away from risk neutrality and suggesting that banks are more risk averse.
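The frequency side of the GLM step mentioned in this study can be sketched as a Poisson regression of annual loss counts on an exposure covariate with a log link. The data, covariate and coefficients below are invented for illustration, and the fit uses a hand-rolled Newton-Raphson rather than the thesis's R workflow.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
exposure = rng.uniform(0.5, 2.0, n)          # e.g. scaled business volume
X = np.column_stack([np.ones(n), exposure])  # intercept + covariate
true_beta = np.array([0.3, 0.8])             # invented "true" coefficients
y = rng.poisson(np.exp(X @ true_beta))       # simulated annual loss counts

# Newton-Raphson for the Poisson GLM with log link
beta = np.zeros(2)
for _ in range(25):
    mu = np.exp(X @ beta)
    grad = X.T @ (y - mu)                    # score of the log-likelihood
    hess = X.T @ (X * mu[:, None])           # Fisher information
    beta = beta + np.linalg.solve(hess, grad)

print("fitted coefficients:", beta)
```

A positive fitted coefficient on exposure corresponds to the study's finding that exposure positively influences the modelled risk.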

An Analysis of Operational Risk Events in US and European Banks 2008-2014

Annals of Actuarial Science, 2017

This paper explores the characteristics of 2,141 operational risk events amongst European (EU) and US banks over the period 2008–2014. We have analysed the operational risk events using a method originating in biology for the study of interrelatedness of characteristics in a complex adaptive system. The methodology, called cladistics, provides insights into the relationships between characteristics of operational risk events in banks that are not available from traditional statistical analysis. We have used cladistics to explore whether there are consistent patterns of operational risk characteristics across banks in single and different geographic zones. One significant pattern emerged, indicating that there are key, stable characteristics across both geographic zones and across banks in each zone. The results identify the characteristics that could then be managed by the banks to reduce operational risk losses. We have also analysed separately the characteristics of operational risk events for "big" banks and for extreme events; these results indicate that big banks and small banks have similar key operational risk characteristics, but that the characteristics of extreme operational risk events differ from those of non-extreme events.

Operational Risk Management and Implications for Bank’s Economic Capital – a Case Study

2008

In this paper we review actual operational loss data from an anonymous Central European bank, using two approaches described in the literature: the loss distribution approach and extreme value theory ("EVT"). Within the EVT analysis, two estimation methods were applied: the standard maximum likelihood estimation method and the probability-weighted moments method ("PWM"). Our results proved a heavy-tailed pattern of
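The EVT step described here typically proceeds by fitting a Generalised Pareto Distribution (GPD) to excesses over a high threshold. The sketch below shows the maximum likelihood variant on simulated heavy-tailed data; it illustrates the general technique, not the paper's calibration (the paper additionally applies the PWM estimator).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Heavy-tailed toy losses: Lomax with tail index 2, so the GPD
# shape parameter (xi) of the tail is about 1/2
losses = rng.pareto(2.0, size=5_000) * 50_000

# Excesses over a high threshold (here the empirical 95th percentile)
threshold = np.quantile(losses, 0.95)
excesses = losses[losses > threshold] - threshold

# MLE fit of the GPD to the excesses, with location fixed at 0
shape, loc, scale = stats.genpareto.fit(excesses, floc=0.0)
print(f"GPD shape (xi) ~ {shape:.2f}, scale ~ {scale:,.0f}")
```

A fitted shape parameter above zero is the "heavy-tailed pattern" such analyses detect: the larger xi is, the fatter the loss tail and the higher the extreme quantiles.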

Measuring Operational Risk in Financial Institutions: Contribution of Credit Risk Modeling

SSRN Electronic Journal, 2005

The scarcity of internal loss databases tends to hinder the use of the advanced approaches for operational risk measurement (AMA) in financial institutions. As there is greater variety in credit risk modelling, this paper explores the applicability of a modified version of CreditRisk+ to operational loss data. Our adapted model, OpRisk+, yields very satisfactory Value-at-Risk estimates at the 95% level compared with estimates drawn from sophisticated AMA models. OpRisk+ proves especially worthy in the case of small samples, where more complex methods cannot be applied.

New Tendencies in Operational Risk Management in Banks: Challenges and Opportunities

Humanities and Social Sciences. Latvia, 2022

The importance of operational risk management in banks increases every year. Banks need to take action to prevent fraudulent activities, minimize errors in transactions, automate processes and improve data security. Ignoring operational risk procedures or failing to implement suitable control mechanisms could lead to unexpected losses, unsatisfied customers and potentially regulatory sanctions, all of which could seriously harm a bank's reputation in a highly competitive market. A specific focus is on payments and securities transactions, as they are linked to the biggest risks. Any regulatory-driven project failure or IT project failure in the bank, insufficient project governance, failed implementation of a new system or failure in external data sources can lead to even bigger losses. After a review of the Basel Framework and the new set of standards taking effect as of 2023, the aim of this article is to elucidate the changes related to operational risk capital in banks and to ascertain the weakest points in operational risk management. The topic is therefore timely and relevant, as the aim of the research is to identify the possible changes within operational risk management in banks by gathering and analysing empirical evidence. This article is based on academic research and professional experience. The methods used in the research are comparison, generalization and graphical illustration of statistical information, and identification of the main ideas of regulatory frameworks and legal documentation. The main results and findings are that banks will need to rethink their capital management strategies; the article emphasizes the importance of a redesigned approach to operational risk assessment in Basel III and substantiates the efficiency of the proposed framework.
With Basel III, each loss may cause greater challenges, as it will be counted twice: as a direct impact on profit/loss and as a direct impact on future operational risk capital. Another finding is that the biggest amounts of losses are related to corporate items events and according
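The double impact noted here can be made concrete with the Basel III standardised-approach formula, in which historical losses feed the loss component of required capital. The sketch below uses the published bucket coefficients (12%/15%/18% on the Business Indicator) and the internal loss multiplier ILM = ln(e - 1 + (LC/BIC)^0.8) with LC = 15x average annual losses; the bank figures themselves are illustrative.

```python
import math

def business_indicator_component(bi: float) -> float:
    """BIC: marginal coefficients of 12%/15%/18% over BI buckets (EUR bn)."""
    buckets = [(1.0, 0.12), (30.0, 0.15), (float("inf"), 0.18)]
    bic, lower = 0.0, 0.0
    for upper, coeff in buckets:
        bic += coeff * max(0.0, min(bi, upper) - lower)
        lower = upper
        if bi <= upper:
            break
    return bic

def op_risk_capital(bi: float, avg_annual_losses: float) -> float:
    """Operational risk capital = BIC x ILM (all figures in EUR bn)."""
    bic = business_indicator_component(bi)
    lc = 15.0 * avg_annual_losses  # loss component from historical losses
    ilm = math.log(math.e - 1.0 + (lc / bic) ** 0.8)
    return bic * ilm

# Illustrative bank with a Business Indicator of EUR 5bn:
# higher historical losses raise future required capital
low = op_risk_capital(5.0, avg_annual_losses=0.02)
high = op_risk_capital(5.0, avg_annual_losses=0.10)
print(f"ORC with low losses:  {low:.3f} bn")
print(f"ORC with high losses: {high:.3f} bn")
```

This is the mechanism behind the "counted twice" finding: a loss reduces profit in the year it occurs and then raises the loss component, and hence required capital, in subsequent years.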