Operational risk capital under Pillar 1 of Basel III will soon again be measured with a standardised approach, leaving many banks that today are using the Advanced Measurement Approach (“AMA”) wondering what to do with their sophisticated operational risk models.

There is understandable reluctance to simply switch them off, bearing in mind the embedded investment not just in the models themselves but also in the risk management expertise that is established and sustained through their use.

The question then becomes how best to realise value from this investment:

(A) Should banks keep running the same AMA model and report the outputs under Pillar 2?

(B) If not, what model adjustments should be considered to repurpose AMA models for Pillar 2 or internal use?

(C) More widely, freed of the need to conform to specific prudential standards, how else should the capital modelling be adapted to strike a better balance between the requirements for measurement and those for management?

This article presents our views on these questions.

Option A: Keep running the same AMA model but report the outputs under Pillar 2 

Although banks will be required to use the new Standardised Measurement Approach (“SMA”) for Pillar 1 capital as prescribed by the Basel Framework, under Pillar 2 they still need to make their own internal assessment of capital needs. Since they are free, in principle, to choose their own modelling approach for this, they might be tempted to use their existing AMA model without change for Pillar 2. However, we take a cautious view on this.

AMA models will no longer be used for Pillar 1 for a reason. As the Basel Committee puts it, "inherent complexity, and the lack of comparability arising from a wide range of internal modelling practices, have exacerbated variability in risk-weighted asset calculations, and eroded confidence in risk-weighted capital ratios”. Against this background, it is hard to see regulators having confidence in a Pillar 2 model whose methodology they have so explicitly undermined for the purposes of Pillar 1.

In any case, there is a clear expectation that internal models and modelling processes should pass the management ‘use test’. Therein lies the greatest obstacle: most legacy AMA models we have seen were built with more emphasis on statistical techniques than on business usefulness. Their focus is on measurement rather than management, and the resulting computational complexity makes them cumbersome management tools at best.

To give a flavour of the typical complexity, the summary below sets out, for the single subject of tail extrapolation, the range of modelling choices made by four representative banks.

External Loss Data: Distributions
  • Bank A: Log Normal, Pareto
  • Bank B: Log Normal, Weibull, Generalised Pareto (“GPD”), Burr
  • Bank C: Log Normal, Log Logistic, Pareto, Burr
  • Bank D: Sub-exponential probability distributions

External Loss Data: Estimation Methods
  • Bank A: Maximum Likelihood Estimation
  • Bank B: Maximum Likelihood Estimation; Quantile Distance Estimation; Quantile Matching Estimation
  • Bank C: Maximum Likelihood Estimation
  • Bank D: Maximum Likelihood Estimation

External Loss Data: Tail Attachment Point
  • Bank A: 0.50, 0.55, ..., 0.90 and 0.91, ..., 0.99 percentiles
  • Bank B: 0.60, 0.65, 0.70, 0.75, 0.80 percentiles
  • Bank C: No truncated distribution in use
  • Bank D: 0.90, 0.92, 0.94 percentiles, and losses that leave 10 and 3 events in the tail

External Loss Data: Distribution Selection
  • Bank A: Quadratic Class Upper Tail AD Statistic (“AD2”) value; AD2 p-value (>0.05); number of points in the tail (minimum 20); check on the conditions for the existence of the first moment, performed post Scenario Data inclusion
  • Bank B: AD Statistic p-value (>0.05); AD Statistic value; conditions for existence of the first moment and for appropriateness of tail parameter; number of points in the tail (minimum 30)
  • Bank C: Quadratic Class Upper Tail AD Statistic (“AD2”), Supremum Class Upper Tail AD Statistic (“ADU”) or KS Statistic p-value (>0.05); SBC value; conditions for the existence of the first moment; data sufficiency, measured by RSE
  • Bank D: KS Statistic, Cramer–von Mises Statistic or AD Statistic p-value (>0.05); AIC value; conditions for existence of the first moment and for appropriateness of tail parameter; number of points in the tail

Scenario Data: Combined Distribution
  • Bank A: Bayesian inference to incorporate Scenario Data into the tail distribution; estimation of the prior distribution and of the final tail parameter from the associated posterior distribution
  • Bank B: Scenario Data used to rescale the External Loss Data distribution for each Operational Risk Category (“ORC”)
  • Bank C: Distribution type obtained from the External Loss Data fitting process, together with the estimation of the shape; the scale parameter is calculated from Scenario Data
  • Bank D: For each ORC, External Loss Data is converted into a scenario and the tail distribution is fitted to all Scenario Data

Explaining a period-on-period capital movement from such a model or benchmarking it to the industry average, in a way that could meaningfully inform an appropriate management response, would be challenging to say the least.
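
To give a sense of what just one cell of that summary involves in practice, the minimal sketch below (in Python, using simulated stand-in data, an illustrative 90th-percentile attachment point and a single KS check in place of the various Anderson-Darling statistics quoted by the banks) fits a Generalised Pareto tail above a threshold. In the AMA models summarised above, a step like this is repeated across multiple candidate distributions, attachment points and selection criteria before a single tail is even chosen.

```python
# A sketch of one modelling choice from the summary above: fit a Generalised
# Pareto tail above a chosen attachment point and check the fit. The data, the
# threshold and the KS test are illustrative assumptions only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
losses = rng.lognormal(mean=11.0, sigma=2.0, size=5_000)   # stand-in loss data

threshold = np.quantile(losses, 0.90)                      # tail attachment point
excesses = losses[losses > threshold] - threshold          # peaks over threshold

# Maximum Likelihood Estimation of the GPD shape and scale (location fixed at 0)
shape, _, scale = stats.genpareto.fit(excesses, floc=0)

# A single goodness-of-fit check on the fitted tail
ks_stat, p_value = stats.kstest(excesses, "genpareto", args=(shape, 0, scale))

print(f"Attachment point: {threshold:,.0f}")
print(f"GPD shape = {shape:.2f}, scale = {scale:,.0f}, KS p-value = {p_value:.2f}")
```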

Another management consideration, raised by a few global systemically important banks (“GSIBs”), is that AMA model outputs are ill-suited to allocating financial resources at a granular level. In response, some GSIBs have intentionally excluded AMA operational risk capital from their deal-level Return on Tangible Equity (“ROTE”) frameworks.

In essence, we believe that operational risk models for Pillar 2 purposes need to strike a balance between measurement and management, a balance that most legacy AMA models will struggle to achieve.

Option B: Consider model adjustments to repurpose AMA models for Pillar 2 or internal use

We believe the main priority when adapting AMA models from Pillar 1 to Pillar 2 should be simplification. This is critical to enabling management to recognise the factors driving operational risk capital requirements, and how these – as well as expected operational losses – would respond to changes in operations and operational controls. From our experience, opportunities for simplification include:

  • Reducing direct reliance upon historical and industry-wide loss data, itself often not particularly representative of a bank’s risk profile;
  • Increasing reliance on the outcomes of scenario analysis, and engaging the expert forward-looking views of risk management and front-line business professionals in conducting such analysis;
  • Continuing to distinguish between the frequency and severity of risk events, but fitting appropriate risk distribution structures (such as Poisson and Lognormal) to these events in a simpler, more stable and more consistent manner that managers can readily process and comprehend; and
  • Reducing the complexity of the approaches used to model the tail severity distribution by focusing on a narrower set of possible distributions and goodness-of-fit tests, as illustrated in the sketch after this list.
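
As an illustration of the simpler fitting style we have in mind, the sketch below (in Python, with purely hypothetical loss figures) estimates a single Poisson frequency and a single Lognormal severity by maximum likelihood and applies one goodness-of-fit check, rather than a menu of candidate tails and competing selection criteria.

```python
# A minimal sketch of the simpler fitting approach described above, using
# illustrative placeholder data rather than any bank's actual loss history.
import numpy as np
from scipy import stats

# Hypothetical inputs: annual event counts and individual loss severities (GBP)
annual_event_counts = np.array([12, 9, 15, 11, 8, 14, 10])
loss_severities = np.array([18e3, 25e3, 40e3, 75e3, 120e3, 310e3, 950e3, 2.4e6])

# Frequency: a Poisson rate estimated as the mean annual count (its MLE)
poisson_lambda = annual_event_counts.mean()

# Severity: a single Lognormal fitted by MLE (location fixed at zero),
# rather than a menu of candidate tail distributions
sigma, _, scale = stats.lognorm.fit(loss_severities, floc=0)
mu = np.log(scale)

# One simple goodness-of-fit check (Kolmogorov-Smirnov) instead of a
# battery of competing selection criteria
ks_stat, ks_pvalue = stats.kstest(loss_severities, "lognorm", args=(sigma, 0, scale))

print(f"Poisson lambda = {poisson_lambda:.2f} events/year")
print(f"Lognormal mu = {mu:.2f}, sigma = {sigma:.2f}, KS p-value = {ks_pvalue:.2f}")
```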

Such simplifications can have numerous advantages including:

  • Improving explainability, thus allowing the business to understand the actions they can take to impact the operational risk capital profile;
  • Supporting investment decisions so that operational changes and new controls can reduce the incidence and / or severity of operational losses as well as capital requirements;
  • Shifting the balance from loss data driven analysis (which is inherently backward-looking) to forward-looking scenario analysis including horizon scanning for emerging risks; and
  • Accounting more fully for the effect of the bank’s own risk management practices upon the risk profile.

In doing so, firms can move closer to a practical working partnership between the independent risk function and front-line businesses, with the joint aim of maximising long-run, sustainable, shareholder and stakeholder value.

While some components of the AMA will need to be modified, or even discarded, as the model moves from Pillar 1 to Pillar 2, others should be retained and built upon. We believe there are three key characteristics of AMA models that should remain at the core of any operational risk capital model.

1. Scenario Data
Traditionally used as a way to fill gaps in historical loss data, synthetic, scenario-based loss ‘events’ have taken on crucial importance in building a comprehensive picture of operational risk. Indeed, scenarios are typically now the main drivers of the tail severity distribution (and therefore of the capital estimates themselves) as well as being the main way to incorporate a forward-looking view. Furthermore, beyond helping to quantify risk and capital, scenario-based modelling opens the door to a potentially much richer world of interactive / interrogative risk analysis (including stress and reverse-stress testing) which, in turn, helps to indicate how such scenarios can be anticipated and managed pre-emptively.

2. Independence of Frequency and Severity in the Loss Aggregation Approach
One of the key assumptions of AMA is that the frequency of operational risk events (such as control failures, external events or acts of malpractice), and the severity of losses incurred as a result, are each subject to their own uncertainties, which should therefore be modelled independently of one another prior to aggregation. This assumption forms the basis of the Loss Distribution Approach (“LDA”), the most common approach for modelling operational risk losses, and allows for a more robust and meaningful analysis and understanding of the spectrum from low-frequency, high-severity to high-frequency, low-severity events, as well as their respective determinants. Following the general theme of this article, this approach aids both the robust measurement and the effective management of operational risk.
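
To make the mechanics concrete, the minimal sketch below (in Python, with hypothetical parameters) shows the LDA in its simplest form: an independent Poisson frequency and Lognormal severity combined by Monte Carlo simulation into an annual loss distribution, from which an expected loss and a 99.9th-percentile figure can be read.

```python
# A minimal sketch of the Loss Distribution Approach: frequency and severity
# are modelled independently and combined by Monte Carlo simulation.
# All parameter values are hypothetical.
import numpy as np

rng = np.random.default_rng(42)
n_sims = 100_000          # number of simulated years
poisson_lambda = 10.0     # expected number of loss events per year
mu, sigma = 11.0, 1.8     # Lognormal severity parameters (of log-losses)

annual_losses = np.empty(n_sims)
for i in range(n_sims):
    n_events = rng.poisson(poisson_lambda)            # frequency draw
    severities = rng.lognormal(mu, sigma, n_events)   # independent severity draws
    annual_losses[i] = severities.sum()               # compound annual loss

# Capital-style metrics from the simulated annual loss distribution
expected_loss = annual_losses.mean()
var_999 = np.quantile(annual_losses, 0.999)           # 99.9th percentile

print(f"Expected annual loss: {expected_loss:,.0f}")
print(f"99.9% annual loss quantile: {var_999:,.0f}")
```

Because the two sources of uncertainty are kept separate, management can see directly whether an intervention that reduces event counts, or one that caps loss sizes, has the greater effect on the tail.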

3. Correlation Structure
Under AMA, firms are permitted to incorporate dependence effects into operational risk capital models, to account for diversification. Whether through copulas or a variance-covariance approach, dependence modelling enables diversification benefits to be estimated, typically reaching up to 40% of gross undiversified losses (sometimes higher for US GSIBs). Crucially, this also enables the implications of extreme yet plausible combinations of operational loss events (notwithstanding diversification) to be examined, as an input to a variety of risk management exercises and protocols such as stress tests, risk appetite setting, disaster recovery planning and so on.
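
The sketch below illustrates the copula route in miniature: a Gaussian copula ties together three hypothetical ORCs, and the diversified 99.9th-percentile loss is compared with the simple sum of standalone figures. For brevity each ORC’s annual loss is represented directly by a Lognormal rather than a full compound distribution, and every correlation and parameter is an illustrative assumption.

```python
# A minimal sketch of dependence modelling with a Gaussian copula across
# three hypothetical Operational Risk Categories ("ORCs").
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n_sims = 100_000

# Hypothetical standalone annual loss distributions per ORC (Lognormal for brevity)
orc_params = [(11.0, 1.5), (10.5, 1.8), (12.0, 1.2)]   # (mu, sigma) per ORC

# Hypothetical correlation matrix driving the Gaussian copula
corr = np.array([[1.0, 0.3, 0.2],
                 [0.3, 1.0, 0.4],
                 [0.2, 0.4, 1.0]])

# Draw correlated normals, map to uniforms, then to each ORC's loss distribution
z = rng.multivariate_normal(np.zeros(3), corr, size=n_sims)
u = stats.norm.cdf(z)
orc_losses = np.column_stack([
    stats.lognorm.ppf(u[:, k], s=sig, scale=np.exp(mu))
    for k, (mu, sig) in enumerate(orc_params)
])

# Diversified capital vs. the simple sum of standalone capitals
standalone = sum(np.quantile(orc_losses[:, k], 0.999) for k in range(3))
diversified = np.quantile(orc_losses.sum(axis=1), 0.999)
print(f"Diversification benefit: {1 - diversified / standalone:.1%}")
```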

An important issue to recognise is that the process of adapting AMA models from Pillar 1 to Pillar 2 models cannot follow a set path. This is due to the complexity of legacy AMA models, together with the need for practical risk management utility in the Pillar 2 space and the differences in banks’ operational control frameworks. Therefore, in our experience, the process needs to be worked through carefully, case by case.

If there is a single guiding principle it is this:

“When the goal is to produce a model that helps the bank to make better informed business decisions, firms need to invest in understanding the fundamental components of AMA and their business significance, by reference to the three building blocks described above plus whatever other features management wish to retain or build in.”

As a final cautionary note, having worked with several firms to help adapt and simplify their AMA models, we know it can be easier said than done. The interpretation through a business lens of some of the drivers of even a dramatically simplified AMA model can still pose significant challenges, which can be exacerbated when changes are required. In other words, creating a simple but effective solution, from a complex starting point, for a complex business, is itself a complex process!

Option C: What about a blank canvas approach?

Many firms never had an AMA model, perhaps put off by the high data and technology requirements, and the costs of development, validation and ongoing regulatory approval. Other firms may have been AMA-approved, but nonetheless prefer the idea of a clean start, creating a purpose-designed Pillar 2 solution rather than adapting their AMA model. Either way, we recommend the scenario-based LDA approach for Pillar 2 operational risk modelling. See our blog here for our reasons why.

A scenario-based LDA approach puts emphasis on the quality of the scenario workshops and the rationale of scenario estimates, which in general are the most sensitive drivers of the capital measure. In our experience, this approach can be relied on to yield an intuitive, management-friendly outcome, and – crucially – one that is credible in the eyes of regulators, particularly in satisfying the use test.
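
To show why those scenario estimates are such sensitive drivers, the sketch below (with purely hypothetical workshop figures) converts a single scenario into LDA inputs by quantile matching, using one common convention of treating the ‘typical’ loss as the median and the ‘severe’ loss as a 90th-percentile estimate, and then simulates that scenario’s contribution to the annual loss distribution.

```python
# A minimal sketch of turning scenario-workshop estimates into LDA inputs
# by quantile matching. The scenario figures are purely illustrative.
import numpy as np
from scipy import stats

# Hypothetical workshop outputs for one scenario:
frequency = 1 / 10          # assessed as a 1-in-10-year event (Poisson rate)
typical_loss = 2_000_000    # interpreted here as the median severity
severe_loss = 15_000_000    # interpreted here as the 90th-percentile severity

# Quantile-match a Lognormal severity: ln(typical) = mu, ln(severe) = mu + z90 * sigma
z90 = stats.norm.ppf(0.90)
mu = np.log(typical_loss)
sigma = (np.log(severe_loss) - mu) / z90

# Simulate this scenario's contribution to the annual loss distribution
rng = np.random.default_rng(1)
n_sims = 100_000
annual_losses = np.array([
    rng.lognormal(mu, sigma, rng.poisson(frequency)).sum() for _ in range(n_sims)
])
print(f"mu = {mu:.2f}, sigma = {sigma:.2f}")
print(f"Scenario 99.9% annual loss: {np.quantile(annual_losses, 0.999):,.0f}")
```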

As a bonus, the scenario-based LDA approach yields genuine business / risk management insights, since there is value in the process of building, populating and running the model (involving business people as much as technical risk people) as well as in the model itself and its outputs.

To demonstrate that we put our money where our mouth is, we have built an intuitive operational risk Pillar 2 modelling framework into a hosted-web-service solution called Capital Clarity, which could provide a starting point - or even the end point - if a blank canvas approach is called for.

Want to know more? 

We have worked extensively with our clients through the iterative development of their operational risk capabilities over the last 20 years. In the last two years alone, we have run operational risk model validations; reviewed and enhanced operational risk frameworks and policies; facilitated management workshops with over 50 CROs, heads of operational risk and front-line risk owners across a spectrum of financial institutions; and helped our clients transition to more effective risk functional structures and risk cultures, as operational risk extends to become non-financial risk.

For more information on any of the contents of this article please contact Raymond Zhu or Jan-Hinnerk Fahrenkamp.

See some of our other blogs on the wider capital modelling topics:

The New Age of Basel Pillar 2 Economic Capital Modelling

Modern Operational Risk Modelling

Managing Credit Portfolios


Partner Sponsors: Paul Freeman, Stephen Lucas, Damian Hales, Ian Wilson