
Page 1: CH&CO_SMA review_Op Risk comments and suggestions

June 2016

Standardised Measurement Approach for operational risk (BCBS Consultative Document, March 2016)

Comments, proposals and open questions

Benoît Genest – [email protected]
Hélène Fréon – [email protected]

Page 2: CH&CO_SMA review_Op Risk comments and suggestions


Table of contents

1 Introduction on the review of the SMA methodology based on theoretical profiles

• Drivers of the SMA capital variations over time for a given loss profile

• Risk sensitivity of the SMA methodology through 4 theoretical profiles

2 Analysis of the Consultative Document of March 2016

3 Detailed solutions to be discussed

4 Outstanding Issues

Page 3: CH&CO_SMA review_Op Risk comments and suggestions


Evolution of the SMA Capital considering different BI scenarios

Annual statistics of the sample
• Ca. 6 000 loss events < 10 M€ per year
• 2 to 5 loss events between 10 M€ and 100 M€ per year
• Annual average loss amount = 23 000 €

16-year observation period
• LC computed over 16 years, considering the loss profile is similar over the period
• BI = 9 500 M€ (2006-2015)
• Simulated BI according to the scenarios (2016-2021)

Description of the theoretical initial loss profile
• The loss profile is described in the adjacent illustration. It is similar across the 16-year period.
• Small loss events (< 10 M€) have a higher frequency compared to severe events (> 10 M€).

BI scenarios
(1) Constant BI: assumes the BI stagnates at 9 500 M€ over time (16 years).
(2) Variable BI: considers the BI constant before 2016 (9 500 M€) and either increasing or decreasing from 2016 on, assuming a constant annual variation (+/- 200 M€) in both cases between 2016 and 2021.

Analysis of the impacts per scenario
(1) Considering a constant BI, the SMA capital stagnates over the simulated period.
(2) Considering a variable BI, the SMA capital is estimated between 1 460 M€ and 1 600 M€ depending on the BI scenario (increase/decrease), representing about 15% of the BI level. The BI evolution has a strong impact on the growth rate of the SMA capital over time, for this specific profile.

Drivers of the SMA capital variations over time for a given loss profile (1/2)
Impact of BI variations assuming various theoretical scenarios

The SMA methodology is highly dependent on the LC/BIC ratio, which defines the growth rate of the SMA capital value over time.

Key learning

[Charts]
• Initial theoretical loss profile (identical on the period): aggregated loss amounts (in M€) by severity (0 M€ / 10 M€ / 100 M€); annotated values 1 320 and 690.
• 6-year simulated profiles (starting in 2016): SMA capital (in M€) over 2016-2021 under (1) constant BI (in 2016, BI = 9 500 M€) and (2) variable BI (annual var. +/- 200 M€); annotated values 1 460 M€ and 1 480 M€, variations of +10% and -7%.
The loss profile is similar across the considered 16-year period.

Page 4: CH&CO_SMA review_Op Risk comments and suggestions


Compared effects | Inclusion of a 100 M€ loss vs. inclusion of ten 10 M€ losses in the 2016 loss data history

[Charts] SMA capital (in M€), 2016-2021:
(1) 2016 loss data history | Inclusion of a 100 M€ loss (shock in severity, average variations across scenarios in 2016 and 2021)
(2) 2016 loss data history | Inclusion of ten 10 M€ losses (shock in frequency, average variations across scenarios in 2016 and 2021)

• Following the previous simulation scenarios, the adjacent illustration compares the effect of two stresses applied in 2016 to the loss data history:
One shock in severity (stress 1)
One shock in frequency (stress 2)
• SMA capital variations depend on the type of shock: a shock in severity increases the SMA capital by 15% in 2016 and 12% in 2021 (average variations across the 3 BI scenarios), whereas a shock in frequency keeps the SMA capital at a similar level (-2% on average).
• The SMA capital growth rate is significantly attenuated when the BI decreases or stagnates over time.

Drivers of the SMA capital variations over time for a given loss profile (2/2)
Impact of loss distribution: stresses applied to loss frequency and severity

(1) The SMA methodology increases the capital requirements when losses are extreme (> 100 M€). Indeed, for the same aggregated amount of loss added to the loss data history (1 x 100 M€ or 10 x 10 M€), the impact on the SMA capital depends on its distribution.

(2) A severe loss is taken into account in the LC for 10 years and is weighted 19 times in the LC calculation. Its impact is therefore amplified and sustained over a long time. In that case (occurrence of an extreme loss), the only lever to reduce the SMA capital for 10 years is a decreasing BI over the same period.

Key learnings
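To make the weighting effect described above concrete, the sketch below (Python, illustrative only) computes the LC contribution of a single loss using the 7/7/5 weights of the CD formula; the treatment of losses exactly at the 10 M€ / 100 M€ boundaries is an assumption made for the example.

```python
# Minimal sketch (amounts in M€). Assumption for illustration only: losses exactly at
# 10 M€ / 100 M€ are counted in the higher class (the CD's boundary convention is not restated here).
def lc_contribution(loss, window_years=10):
    """Contribution of a single loss to the LC, averaged over the observation window."""
    weight = 7 + (7 if loss >= 10 else 0) + (5 if loss >= 100 else 0)
    return weight * loss / window_years

print(lc_contribution(100.0))       # one 100 M€ loss: weight 7 + 7 + 5 = 19 -> +190 M€ on the LC
print(10 * lc_contribution(10.0))   # ten 10 M€ losses: weight 7 + 7 = 14 each -> +140 M€ on the LC
```

The same aggregated amount (100 M€) therefore feeds the LC very differently depending on its distribution, which is the point made in the key learnings above.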

[Chart annotations: Initial profiles, Stress 1 profiles, Stress 2 profiles, Stable BI; SMA capital values between 1 460 M€ and 1 650 M€; variations of +15%, +12%, -12%, -1,5% and -2%]

Page 5: CH&CO_SMA review_Op Risk comments and suggestions


4 cases have been simulated below with different distribution profiles to illustrate the sensitivity of the reviewed SMA to the specificities of the distribution profiles (using theoretical hypotheses with extreme losses, and assuming a profile that replicates the same distribution over time)

Risk sensitivity of the SMA methodology through 4 theoretical profiles (1/2)
Description of the simulated cases

CASE 1 | Loss distribution profile with no tail
Loss distribution profile under a leptokurtic probability distribution: the most severe losses are the least frequent.
Statistics of the sample used:
• 2 000 losses < 10 M€
• Average = 0,449 M€
• Std. deviation = 388 K€

CASE 2 | Distortion of case 1 with no extreme losses
Distortion of case 1: inclusion of Loss Class 2 events (> 10 M€) while keeping a concentration on Loss Class 1 events.
Statistics of the sample used:
• 1 962 losses < 10 M€
• 38 losses > 10 M€
• Average = 0,744 M€
• Std. deviation = 2,350 M€

CASE 3 | Presence of a « fat tail »
Case 1 including an extreme loss (2 583 M€) generating a fat tail.
Statistics of the sample used:
• 1 999 losses < 10 M€
• 1 loss > 100 M€
• Average = 1,740 M€
• Std. deviation = 57,8 M€

CASE 4 | Mix of cases 2 and 3
Mix of cases 2 & 3, where the majority of the 2 000 simulated loss events are observed within Loss Class 1.
Statistics of the sample used:
• 1 962 losses < 10 M€
• 33 losses > 10 M€
• 5 losses > 100 M€
• Average = 1,260 M€
• Std. deviation = 11,7 M€

[Charts] Aggregated amounts of losses (in M€) per severity bucket (0 M€ / 10 M€ / 100 M€) for each case; annotated aggregated amounts include 898, 655, 832, 896, 540 and 1 149.

Page 6: CH&CO_SMA review_Op Risk comments and suggestions


Risk sensitivity of the SMA methodology through 4 theoretical profiles (2/2)
Evolution of the SMA capital

[Charts] SMA capital (M€) per bucket (Bucket 1 to Bucket 5) against the BI (M€: 100, 10 000, 30 000, 50 000), with the corresponding LC/BIC ratio, for each case:
• CASE 1 | LC = 3,14 M€ (LC/BIC: 0,2%, 0,05%; SMA capital variations: -79,9%, -82,8%; annotated values: 10, 6 625, 12 100)
• CASE 2 | LC = 126 M€ (LC/BIC: 7,3%, 2%; variations: -73,9%, -80,2%; annotated values: 6 700, 12 100)
• CASE 3 | LC = 12 930 M€ (LC/BIC: 743%, 204%, 106,5%; variations: +37,3%, +59%, +12,3%; annotated values: 12 400, 12 100)
• CASE 4 | LC = 1 272 M€ (LC/BIC: 74%, 20%, 1%; variations: -36,9%, -60%; annotated values: 7 340, 12 100, 100%)

Page 7: CH&CO_SMA review_Op Risk comments and suggestions


Table of contents

1 Introduction on the review of the SMA methodology based on theoretical profiles

2 Analysis of the Consultative Document of March 2016

• Starting assumption | Definition of a prevailing reference date

• Limits and proposals to BCBS' questions (Q2, Q3)

3 Detailed solutions to be discussed

4 Outstanding Issues

Page 8: CH&CO_SMA review_Op Risk comments and suggestions


Two types of loss events are considered when calculating the Loss Component (LC)

• Provisions on expected loss generating events

• Observed losses generated by Operational Risk events

The Basel Committee indicates each loss – whether observed or provisioned – shall be registered with occurrence, accounting and reporting dates (cf. §43, reference date, bullet point 5).

Consequence : bias in the Loss Component calculus

When taking loss events into account in the LC calculus, the reference date depends upon the type of loss event :

• For provisions: the Basel Committee mentions the accounting date (cf. §45, reference date, bullet point 1)

• For losses on observed events: the bank is free to choose either the accounting or the reporting date (cf. §45, reference date, bullet point 2)

This open-ended choice of the reference date skews the LC computation, as the chosen reference date will necessarily vary across banks.

Why should banks be allowed to define their own reference date when taking loss events (excluding provisions on expected losses) into account in the LC computation?

In the Consultative Document, the Basel Committee specifically signalled its willingness to promote transparency through a standardised and homogeneous framework for operational risk measurement across European financial institutions. Therefore the prevailing type of reference date should be clearly specified and applicable to all banks for any type of loss event.

As the accounting date is mentioned as relevant for provisions, it should prevail for all loss events eligible for the LC computation and registered in the data history.

Definition of a standard reference date (CH&Co suggestion)

In the following slides, CH&Co assumes the reference date is the accounting date for any type of loss event.

What are the limits?

Starting assumption | Definition of a prevailing reference date

Page 9: CH&CO_SMA review_Op Risk comments and suggestions


Q2. What are respondents' views on the inclusion of loss data into the SMA? Are there any modifications that the Committee should consider that would improve the methodology?

Limits and proposals to BCBS' questions (Q2, Q3)
Reminder of questions 2 and 3

Q3. What are respondents' views on this example of an alternative method to enhance the stability of the SMA methodology? Are there other alternatives that the Committee should consider?

Page 10: CH&CO_SMA review_Op Risk comments and suggestions


Qualitative asymmetry (considering a 10-year loss history)

• Data collection standards (in terms of quality and completeness) improved and will vary between t-10 and t: banks should benefit from a "learning effect" over time, enhancing their data collection and treatment standards

• When computing loss events across a 10-year history, the LC therefore aggregates losses of variable quality.

History mismatch within the LC/BI ratio (as defined by BCBS, §23, bullet point 1)

• There is a mismatch between the depth of the LC history (5 to 10 years) and the BIC history (3 years)

• Thus, the ratio is skewed.

Flexibility to define the depth of the loss data history – 5 to 10 years depending on available loss data (cf. §43, bullet point 1)

• This might leave open the possibility of loss events arbitrage, especially in the case of remote historical events (> 5 years).

• A 5-year loss history is considered less illustrative of the loss distribution profile for a given bank compared to a 10-year one; however a loss that happened 10 years ago might not be representative of the bank's current profile

Limits

• BI data history: The reviewed definition of the BI computation will necessarily force banks to recalculate the BI pre-2016 to comply with the mandatory BI depth. In practice, this can burden banks with additional computations.

• Loss data history: Some banks do not hold a 10-year internal loss data history, which also requires a significant long-term effort in terms of both completeness and quality of the data collection.

Proposals to the Basel Committee: Align both BI and LC history on a 10-year observation period

(1) Define a minimum required history depth for a bank starting with the SMA methodology: 5 years for both the LC and the BI.

(2) The ratio must match the histories taken into account, whatever the depth chosen: 5 years minimum for both the LC and the BI, ideally 10 years. If a bank holds only 5 years of data for the BI and 10 years for the LC, the logic of our proposal is to calculate the ratio with 5 years for each component.

(3) Set up a transitional period (following BCBS' suggestion on the LC history in §43, bullet point 1) for banks which do not hold enough data; to fill this gap, it is recommended to use additional information through stress tests on external data in order to complete the 10 years required.

(4) Complete the internal database with additional data, especially loss events with a probability of occurrence of between 5 and 10 years. Two types of data might be used: simulated losses (via internal scenarios – EBA stress tests for example) or data sourced from external databases.

Proposal | Align both BI and LC observation periods to provide a long-term and sound standard

What are the limits?

Limits and propositions on Q2 (1/3)
Align the depth of both BI and LC histories to provide a long-term and sound standard

Page 11: CH&CO_SMA review_Op Risk comments and suggestions


What are the limits?

Open-ended definition of a de minimis threshold for internal loss data collection (mandatory, defined by each bank, §43, bullet point 4)

The Basel Committee assumes that a threshold shall be applied when collecting loss data to guarantee the soundness of internal data. However, this de minimis gross loss threshold cannot exceed 10 K€ for banks with an existing loss data history.

How should this de minimis threshold be defined?

The inappropriate calibration of the threshold might affect the quality of data collection:

• An exceedingly low threshold might force banks to collect marginal/irrelevant incidents and limit the efficiency of the OR framework (time-consuming collection)

• An exceedingly high threshold can prejudice the efficiency of the OR framework by neglecting important risk situations and/or areas

Proposals to the Basel Committee

(1) Materiality: Consider a materiality threshold that reflects the loss distribution and the bank's risk appetite across its different business activities.

(2) Internal calibration (by banks) : Banks should define their own threshold(s) considering their risk appetite and their business activities.

However, in order to stabilize the calibration methodology, BCBS has to insist on the:

• Homogenisation of the threshold definition, distinguishing local thresholds (per activity) from an aggregated threshold used for risk mitigation at group level

Given the diversification of banks' activities/portfolios, it is recommended to use a threshold adjusted to each business line in order to mitigate the risk locally and to reflect the related loss profile.

The SMA capital requirements calculation would then restrict the counting of losses to the threshold calibrated by the bank at group level.

Proposal | Define and adapt reporting thresholds per type of bank/activity

Limits and propositions on Q2 (2/3)
Define indicative reporting thresholds per type of bank/activity

Page 12: CH&CO_SMA review_Op Risk comments and suggestions


What are the limits? What are the suggested solutions?

Lack of granularity in the loss classes and absence of loss frequency

• The LC is a weighted sum of the empirical averages of losses on observed events (based on the loss classes defined in the CH&Co assumption below)

• The weights (3) depend on the loss amounts and are calibrated by the Basel Committee through the 2015 QIS (cf. LC formula, §35)

• The 3 levels (defined as loss classes) lack granularity. The gaps between the loss classes are significant, as they put different levels of losses at the same stage.

The variations in the loss profiles are not considered in the LC calculus: the weighting coefficients are point-in-time, based on the 2015 QIS, and no future adjustment is mentioned by the Basel Committee.

Correction of the LC formula to refine the loss weighting

(cf. solution 2)

• Addition of new loss classes: the LC formula has to close the gaps by including more levels of loss values, especially for Loss Classes 1 & 2

• Periodic review of the weighting by the Basel Committee, based on:

The variation of the systemic and idiosyncratic risks over the period of time considered

The evolution of the loss distributions collected by the Basel Committee through the previous QIS and the variation of risk sensitivity

Review of the Loss Component computation

Inclusion of 10 years of loss history in LC

• Losses are not discounted over time

• The underlying of the loss matters: the amortisation rate differs depending on the type of loss. For example, a technical equipment claim and a cash loss cannot be capitalised the same way.

Capitalisation of losses, considered as an opportunity cost, using a discount factor based on (cf. solution 1):

• Risk-free rate + internal rate of return, with the assumption that the reference date is the accounting date of the loss

• Exchange rate, if the losses considered are not in euros (with the same assumption on the reference date)

Aggregation of each loss amount with a specific annual depreciation rate adapted to its underlying:

• Losses are considered as a decreasing sequence over time, as they are weighted by their occurrence date

Losses accounting through the bank's history

CH&Co Assumption
In the rest of this document, 3 loss classes are considered, defined in the CD as follows:
• Loss class 1: losses on observed events with amounts lower than 10 M€
• Loss class 2: losses on observed events with amounts between 10 M€ and 100 M€
• Loss class 3: losses on observed events with amounts higher than 100 M€

Limits and propositions on Q2 (3/3)
Inclusion of internal data in the LC calculus

Page 13: CH&CO_SMA review_Op Risk comments and suggestions


Efficiency of the OR framework not properly accounted for

We consider that an operational risk framework improves its effectiveness if it is able to:

• Adapt/adjust its management of risks in light of previous events, especially in the prevention and mitigation processes, which represent heavy investments for banks

• Project and control over time the evolution of the different components used in the SMA methodology (BI and LC)

Backward-looking vision only

• The SMA methodology includes only past losses and income

• The method is not future-oriented in terms of the BI (gross income projections are not included)

Weak consideration of systemic risk effects

• The model considers only the idiosyncratic part of the risk, as it is based on the bank's internal data

• The lack of external data and of stress tests based on specific scenarios is an issue for an optimal understanding of the variations of the systemic risk over time

Introduction of a « rewarding » dimension based on the efficiency of the OR framework (cf. solution 3.1)

The reward should be adjusted to the bank's ability to control its:

• Operational risk exposure (observed events from Loss Class 1 over the last 3 years)

• Volume of activities (BI computed over the last 3 years)

Introduction of a forward-looking component which takes into account the variations of the OR exposure over time (cf. solution 3.2)

This component should be calibrated with regard to:

• The projections of the expected losses (EL) and of the BI over 3 years, based on the expected variation of the OR exposure and on internal/external data (through stress-test scenarios)

• The difference between the observed and the expected values (back-testing)

Review of the SMA capital calculus

What are the limits? What are the suggested solutions?

Limits and solutions proposed for Q3 (1/2)
Improvements proposed to review the SMA formula

Page 14: CH&CO_SMA review_Op Risk comments and suggestions


Limits and solutions proposed for Q3 (2/2)
Diagnosis of the alternative method (calibration of the m factor)

Presentation of the alternative method: ILM calculus

• The Committee suggests in the consultative document to calculate the Internal Loss Multiplier through an alternative method:

ILM = [m × LC + (m − 1) × BIC] / [LC + (2m − 2) × BIC]

Where m is a factor to be calibrated

• L’ILM is a multiplier illustrating the evolution of internal lossescompared to the BI, its variation goes hand-in-hand with thecapital requirements evolution over time

• The alternative function presented above has to verify the sameproperties than the logarithmic function: the growth behaviourfor an increasing ratio LC/BIC and the positive values. If we usetheses conditions for the alternative method, we conclude thatthe m factor has to be greater than 1.
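To make the comparison concrete, the sketch below (Python, illustrative values only) evaluates the logarithmic ILM of the consultative document, ln(exp(1) − 1 + LC/BIC), against the alternative form as reconstructed above.

```python
from math import exp, log

def ilm_log(lc, bic):
    """Logarithmic ILM of the main SMA proposal: ln(exp(1) - 1 + LC/BIC)."""
    return log(exp(1) - 1 + lc / bic)

def ilm_alt(lc, bic, m):
    """Alternative ILM as reconstructed above: increasing in LC/BIC for m > 1
    and converging to m as LC grows to infinity."""
    return (m * lc + (m - 1) * bic) / (lc + (2 * m - 2) * bic)

# Illustrative comparison on a hypothetical profile (values in M€)
bic = 1_000.0
for lc in (100.0, 1_000.0, 10_000.0):
    print(lc, round(ilm_log(lc, bic), 3),
          round(ilm_alt(lc, bic, 1.2), 3),
          round(ilm_alt(lc, bic, 1.5), 3))
```

For large LC, the logarithmic ILM keeps growing while the alternative multiplier levels off at m, which is the capping behaviour discussed below.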

[Chart] Internal Loss Multiplier and SMA capital (in M€) as a function of the Loss Component (in M€), for BI = 9 500 M€: logarithmic SMA vs. the alternative method with m = 1,2 and m = 1,5; as LC → +∞ the alternative ILM converges to m.

The alternative aims at replacing the SMA methodology for banks with severe, low-occurrence-probability losses in their history (especially in Loss Class 3: amounts above 100 M€) (cf. solution 4)

The idea developed by the Basel Committee (Annex 2 of the CD):
• Given the weighting of losses > 100 M€ and the 10-year internal loss history, the SMA methodology heavily penalises banks which have an extreme and infrequent loss in their history.
• Thus, in these cases, the alternative restrains the evolution of the capital requirements by capping the ILM at the level m.

Comparison of the alternative method and the classic SMA methodology
Main benefits of the alternative method during stress periods (occurrence of extreme losses > 100 M€):
• Stabilisation of the impacts on capital requirements during an extreme loss shock for a given financial market
• For a bank, the alternative method limits the volatility of the capital requirements in case of severe losses while preserving the ability to hedge all potential events

Main drawback of the alternative method:
• Through this alternative, the SMA methodology is more conservative than the classic SMA for a bank profile where the LC/BIC ratio is under 1
• Example: for a BI equal to 9 500 M€ and LC < 1 750 M€, SMA capital (alternative method) > SMA capital (classic method)

The m factor has to be calibrated according to:
• The ability of the SMA formula to cover all the potential operational risks to which the bank is exposed
• The need to ensure a stable financial market by minimising the variations between banks in case of extreme losses

Page 15: CH&CO_SMA review_Op Risk comments and suggestions


Table of contents

1 Introduction on the review of the SMA methodology based on theoretical profiles

2 Analysis of the Consultative Document of March 2016

3 Detailed solutions to be discussed

• Detailed solution 1 | Losses accounting through the bank's history

• Detailed solution 2 | Review of the Loss Component computation

• Detailed solution 3 | Review of the SMA capital formula

• Detailed solution 4 | Proposal of a calibration for the m factor

4 Outstanding Issues

Page 16: CH&CO_SMA review_Op Risk comments and suggestions


Capitalisation Methodology of losses| The loss is considered as an “opportunity cost”

Loss(t0) = Loss(t) × FXr(t0) × DF(t0, t)

Proposal: The operational risk loss is considered as an opportunity cost, in other words an estimate of the missed opportunities to invest in the bank's businesses. This cost is then capitalised through a coefficient composed of:

• Discount factor: DF(t0, t) = exp(R(t0, t) × (t0 − t)), with t0 > t

• Risk-free rate (OIS) and internal rate of return (IRR): R(t0, t) = IRR(t0, t) + OIS(t0, t)

• Exchange rate: FXr(t0) = 1 if the loss is expressed in euros; otherwise FXr(t0) is the exchange rate at t0

Limit identified: The discount rate of loss amounts depends on the type of loss considered, as the amortisation rate is based on the underlying (example: equipment loss vs. cash loss)

Proposals to the Basel Committee:

• We suggest considering the loss history as a decreasing arithmetic-geometric sequence over time, and then adjusting the averages calculated in the Loss Component

• Calibration of the coefficients for each business line the considered loss relates to

Main benefit

The time weighting stabilises the LC as it avoids potential jumps in the formula due to the rolling history (when the outgoing loss amount is greater than the incoming one), knowing that older losses are weighted less over time

For a given loss capitalised at t0:
• t0: date of actualisation (CH&Co assumption: latest quarterly closing date)
• t: loss reference date (CH&Co assumption: accounting date)
• t0 > t

Detailed solution 1 | Losses accounting through the bank’s history
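A minimal numerical sketch of the capitalisation formula above is given below; the rates, dates and amounts are illustrative assumptions, and dates are expressed as year fractions.

```python
from math import exp

def capitalised_loss(loss_amount, t, t0, ois_rate, irr, fx_rate=1.0):
    """Capitalise an operational risk loss observed at its reference date t up to the
    actualisation date t0 (t0 > t), following the 'opportunity cost' proposal above.

    loss_amount : gross loss at its accounting date t (in the loss currency)
    t, t0       : dates expressed as year fractions, with t0 > t
    ois_rate    : risk-free (OIS) rate over [t, t0], annualised
    irr         : internal rate of return over [t, t0], annualised
    fx_rate     : FXr(t0), conversion rate to euro at t0 (1.0 if the loss is in euro)
    """
    r = ois_rate + irr                 # R(t0, t) = IRR(t0, t) + OIS(t0, t)
    df = exp(r * (t0 - t))             # DF(t0, t) = exp(R(t0, t) * (t0 - t))
    return loss_amount * fx_rate * df  # Loss(t0) = Loss(t) * FXr(t0) * DF(t0, t)

# Illustrative values only: a 2 M€ loss accounted 3 years before t0, OIS 0.5%, IRR 4%
print(capitalised_loss(2_000_000, t=0.0, t0=3.0, ois_rate=0.005, irr=0.04))
```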

Page 17: CH&CO_SMA review_Op Risk comments and suggestions


Addition of new loss classes to refine the loss weighting

Proposal:

• Calibration by the Basel Committee of new intermediate levels (loss classes) based on the data collected through the QIS

• The gaps between the different loss classes also have to be correlated with the loss amounts, meaning that the class boundaries will increase exponentially for higher amounts.

Proposals to the Basel Committee:

The Basel Committee has to set, according to the risk sensitivity and the data available in the previous QIS, an optimal number of classes, with more levels for lower amounts, especially in Loss Classes 1 and 2 (cf. illustration).

Illustration: for a given profile, the 3 loss classes mentioned in the consultative document (in orange) and CH&Co's suggestion (in grey)

Proposal: Through-the-cycle method (in grey + δ in orange)
CH&Co suggests a regular review of the weighting coefficients based on the evolution of the economic circumstances and the potential distortion of the loss profiles compared to the 2015 levels.
• Review of the coefficients by the Basel Committee every 2/3 years through the adjustment of δ
• Calibration of each coefficient based on the related loss class (3 classes here: Loss class 1: [0; 10 M€[, Loss class 2: [10; 100 M€[ and Loss class 3: [100 M€; +∞[)
Formula:
• The adjustment explained above is illustrated by the factors δi (i being the loss class considered)
• δ1, δ2, δ3 are positive or negative coefficients depending on the shape of the variations observed by the Basel Committee in the collected data

Loss Component(t0) = (7 + δ1) × average total annual loss
+ (7 + δ2) × average total annual loss, only including loss events above 10 M€
+ (5 + δ3) × average total annual loss, only including loss events above 100 M€
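A minimal sketch of the adjusted Loss Component above is given below; the δ adjustments default to zero (which recovers the 7/7/5 weights of the CD) and the yearly loss history is a hypothetical input.

```python
def loss_component(losses_by_year, delta=(0.0, 0.0, 0.0)):
    """Sketch of the adjusted Loss Component (amounts in M€).

    losses_by_year : list of lists, one inner list of individual loss amounts per year
                     of the observation period (5 to 10 years in the CD; 3 here for brevity)
    delta          : (d1, d2, d3) adjustments, reviewed periodically by the Committee
    """
    def avg_annual(threshold):
        # average, over the years, of the total annual loss restricted to losses above `threshold`
        return sum(sum(l for l in year if l > threshold) for year in losses_by_year) / len(losses_by_year)

    d1, d2, d3 = delta
    return ((7 + d1) * avg_annual(0.0)       # all losses
            + (7 + d2) * avg_annual(10.0)    # only losses above 10 M€
            + (5 + d3) * avg_annual(100.0))  # only losses above 100 M€

history = [[0.5, 2.0, 12.0], [0.3, 150.0], [1.2, 0.8]]  # hypothetical 3-year loss history, in M€
print(loss_component(history))
```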

New weighting of losses based on the economic circumstances

Methodology suggested by the Basel Committee: point-in-time method (in grey)
• Description: The coefficients considered in the consultative document are calibrated on the situation collected by the BIS in 2015, which means that they are point-in-time on the current QIS.
• Bias: The distribution profiles depend on the variation of the systemic risk over time (market volatility), economic circumstances, etc. Thus, the point-in-time method does not take these external, time-dependent factors into account.

Detailed solution 2 | Review of the Loss Component computation

[Illustration] Frequency (loss occurrence) vs. severity (loss amount), with thresholds at 0 M€, 10 M€ and 100 M€.

Page 18: CH&CO_SMA review_Op Risk comments and suggestions


SMA capital(t0) = 110 + (BIC − 110) × ILM × (1 + CAR[y−3; y0](t0)) (amounts in M€)
y0 and t0 respectively represent the year and corresponding date of computation (t0 should correspond to the end-of-year account closing date); y−1, y−2, y−3 represent the considered 3-year observation period.

Methodology

• At the end of a given year y0, each bank would consider the year-on-year variations of the OL1(y) / BI(y) ratio over the past 3 years (i.e. from y−3 to y0), and identify the year-on-year trend of these variations.

• From CH&Co's premises, if all these variations are strictly negative over the past 3 years (i.e. the ratio decreases every year), then the bank should be rewarded on the SMA capital at the end of y0.

• The reward applied to the SMA capital should be proportional to 1 + CAR[y−3; y0](t0), the compound annual rate of the ratio over the past 3 years. This rewarding factor takes into account the speed of decrease of the OL1(y) / BI(y) ratio over the past 3 years and proportionally rewards the bank. We assume that, in this case, the CAR is negative, so that the factor (1 + CAR) is below 1 and reduces the SMA capital.
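A minimal sketch of this rewarding mechanism is given below, assuming (as in the reconstructed formula above) that the reward scales the (BIC − 110) × ILM term, that CAR is the compound annual rate of the OL1/BI ratio over the 3-year window, and that the reward is granted only when the ratio decreases strictly every year; all figures are illustrative.

```python
def compound_annual_rate(ratios):
    """CAR of the OL1/BI ratio over the window: ratios = [r(y-3), r(y-2), r(y-1), r(y0)]."""
    n_years = len(ratios) - 1
    return (ratios[-1] / ratios[0]) ** (1.0 / n_years) - 1.0

def sma_capital_with_reward(bic, ilm, ratios):
    """Solution 3.1 sketch (amounts in M€): the reward scales the (BIC - 110) * ILM term
    and applies only if the ratio decreased strictly every year."""
    decreasing = all(b < a for a, b in zip(ratios, ratios[1:]))
    reward = 1.0 + compound_annual_rate(ratios) if decreasing else 1.0
    return 110.0 + (bic - 110.0) * ilm * reward

# Hypothetical bank: OL1/BI ratio falling from 1.0% to 0.8% over the 3-year window
print(sma_capital_with_reward(bic=2_000.0, ilm=1.1, ratios=[0.010, 0.009, 0.0085, 0.008]))
```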

Proposal

• Description: Offer a reward (reducing the overall SMA capital amount) to banks demonstrating a significant efficiency of their OR framework. CH&Co considers a 3-year period to be sufficiently robust to illustrate this improvement. Theoretically, the efficiency of the OR framework is demonstrated by the bank's capability to manage its OR exposure (observed losses from Loss Class 1) in view of its volume of business (BI).

• Implementation: Inclusion of a decreasing indicator, activated if and only if the ratio Observed Losses (Class 1) / BI has shown a decreasing tendency over the last 3 years.

In what circumstances is the ratio decreasing?

Improvements in the bank's OR management and risk profile are observed if:

• Situation 1: The revenues (BI) grew faster than the losses (OL1) over the year.

• Situation 2: Losses decreased while the volume of activity stagnated or grew.

• Situation 3: In case of loss stagnation, a decreasing variation of the ratio is fully explained by a strong stimulation of the business revenues and volume.

Detailed solution 3.1 | Review of the SMA capital formula
Evolution of the OR framework efficiency

Formula

Page 19: CH&CO_SMA review_Op Risk comments and suggestions


SMA capital(t0) = 110 + (BIC − 110) × ILM × g(OL1(y0) − EL1(y−1)) (amounts in M€)

• OL1(y0) are the observed losses from Loss Class 1 at the end of year y0
• EL1(y−1) are the expected losses from Loss Class 1, estimated in y−1 for the following year y0
• The rewarding function g is activated if and only if the gap between the Loss Class 1 projections and the observed values lies within a confidence interval calibrated by the Committee, with a 95% confidence level and σ the standard deviation of EL1(y−1)
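A minimal sketch of this reward is given below; the ±1.96σ band and the linear g are illustrative stand-ins for the interval and function to be calibrated by the Committee.

```python
def sma_capital_projection_reward(bic, ilm, ol1_y0, el1_prev, sigma, g):
    """Solution 3.2 sketch (amounts in M€): the rewarding function g is applied only when
    the projection gap lies within the 95% confidence band, taken here as +/- 1.96 * sigma
    (an illustrative assumption; the interval is to be calibrated by the Committee)."""
    gap = ol1_y0 - el1_prev                    # observed minus expected Loss Class 1 losses
    reward = g(gap) if abs(gap) <= 1.96 * sigma else 1.0
    return 110.0 + (bic - 110.0) * ilm * reward

# Hypothetical linear g, bounded close to 1, as a stand-in for the function to be calibrated
print(sma_capital_projection_reward(bic=2_000.0, ilm=1.1, ol1_y0=95.0, el1_prev=100.0,
                                    sigma=10.0, g=lambda gap: 1.0 + 0.002 * gap))
```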

Detailed solution 3.2 | Review of the SMA capital formula
Quality of the risk appetite projections

Methodology

Proposal

• Description: Introduction of a rewarding function depending on the measurement of the gaps between the projections and the realised values of Loss Class 1. The projections are based on internal and external data and scenarios, on an estimation of the systemic risk and on GNP projections derived from macroeconomic indexes.

• Annual calibration of the reward:

At the end of each year y, each bank would provide projections of its expected Loss Class 1 loss amounts for the year to come, y+1 (EL1(y+1)). This projection would then be compared, at the end of y+1, to the observed Loss Class 1 losses (OL1(y+1)).

Formula

Page 20: CH&CO_SMA review_Op Risk comments and suggestions


Final combined proposal

SMA capital(t0) = 110 + (BIC − 110) × ILM × [γ1 × (1 + CAR[y−3; y0](t0))] × [γ2 × g(OL1(y0) − EL1(y−1))]

(1) Rewarding factor indexed on the efficiency of the OR framework

(2) Rewarding factor indexed on the quality of the risk appetite projections

Solutions 3.1 and 3.2 combined | Review of the SMA capital formula

(1) SMA capital(t0) = 110 + (BIC − 110) × ILM × (1 + CAR[y−3; y0](t0)), where (1 + CAR[y−3; y0](t0)) is the rewarding factor

y0 and t0 respectively represent the year and corresponding date of computation (t0 should correspond to the end-of-year account closing date); y−1, y−2, y−3 represent the considered 3-year observation period; CAR[y−3; y0](t0) is the compound annual rate of the ratio over the past 3 years

(2) SMA capital(t0) = 110 + (BIC − 110) × ILM × g(OL1(y0) − EL1(y−1)), where g(OL1(y0) − EL1(y−1)) is the rewarding factor

OL1(y0) are the observed losses from Loss Class 1 at the end of the year; EL1(y−1) are the expected losses from Loss Class 1, estimated in y−1 for the following year y0; g is a function to be calibrated by the Basel Committee

These 2 solutions remain independent; however it is possible to combine them as long as the indicators' impacts on the SMA capital are limited, based on the impact and sensitivity of each component added.

Proposal to the Basel Committee

γ1,γ2 are weighting coefficients to be calibrated by the Committee in view of the impacts and sensitivity of each rewarding factor on the SMA capital.
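A minimal sketch of the combined formula, with γ1, γ2, the CAR value and the g value passed as inputs (all illustrative):

```python
def combined_sma_capital(bic, ilm, car, g_gap, gamma1=1.0, gamma2=1.0):
    """Sketch of the combined proposal (amounts in M€): both rewarding factors applied to
    the (BIC - 110) * ILM term, weighted by gamma1 / gamma2 (to be calibrated by the
    Committee). `car` is the compound annual rate of the OL1/BI ratio and `g_gap` the
    value of the projection-quality function g for the observed gap."""
    return 110.0 + (bic - 110.0) * ilm * (gamma1 * (1.0 + car)) * (gamma2 * g_gap)

print(combined_sma_capital(bic=2_000.0, ilm=1.1, car=-0.07, g_gap=0.99))
```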

Page 21: CH&CO_SMA review_Op Risk comments and suggestions


Objective: limit the volatility of the capital requirements for a given bank in case of severe losses, while preserving the ability to hedge all potential events

Illustration of the method proposed based on two simulated banks’ profiles

• Simulation of a shock between times t and t+1: the LC is doubled

• Estimation of the SMA capital at t (cf. points C1) and at t+1 (cf. points C3), with an m factor calibrated so that the alternative ILM equals the initial (logarithmic) ILM

• Projection of the capital requirements at t+1, calculated through the alternative with the m level of time t (cf. point C2)

The difference between C1 and C3 is the adjustment that has to be considered in order to ensure an efficient risk hedging.
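A minimal sketch of this equalisation step is given below, assuming the alternative ILM form reconstructed on the previous pages and the logarithmic ILM of the CD; the LC and BIC values are purely illustrative (the slide's illustration quotes BI, not BIC).

```python
from math import exp, log

def ilm_log(lc, bic):
    return log(exp(1) - 1 + lc / bic)

def ilm_alt(lc, bic, m):
    return (m * lc + (m - 1) * bic) / (lc + (2 * m - 2) * bic)

def equalising_m(lc, bic, lo=1.0001, hi=50.0):
    """m such that the alternative ILM matches the logarithmic ILM for this (LC, BIC)
    profile, i.e. the level retained at points C1/C3 (simple bisection)."""
    f = lambda m: ilm_alt(lc, bic, m) - ilm_log(lc, bic)
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# Shock of the illustration: the LC doubles between t and t+1 (BIC value is hypothetical)
m_t = equalising_m(lc=1_000.0, bic=1_500.0)    # m at time t (point C1)
m_t1 = equalising_m(lc=2_000.0, bic=1_500.0)   # re-adjusted m after the shock (point C3)
print(round(m_t, 3), round(m_t1, 3), round(m_t1 - m_t, 3))
```

The distance between the two m values is the volatility measure used in the classification on the next page.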

Proposals to the Basel Committee

• Adjustment of the m factor each year by the banks, based on simulations of losses with high severity and low occurrence probability

• Definition, by the Basel Committee, of the classic and extreme cases that have to be simulated by the banks in order to stress test their parameters and validate the m level. It is recommended to use external data and extreme scenarios (the « Robustesse » group for French banks, for example).

Solution 4 | Proposal of a calibration for the m factor (1/2)
Stand-alone calibration | Considering a given bank

Illustration

• Simulation of two banks' profiles, bank A and bank B:

BI_A = 5 000 M€; LC_A = 1 000 M€

BI_B = 9 500 M€; LC_B = 2 000 M€

• For each bank i (i = A or B):

Point Ci;1: projection of the m factor related to the SMA capital, considering the loss distribution profile at time t

Point Ci;2: estimation of the capital requirements post-shock for the same m level as at time t

Point Ci;3: projection of the SMA capital post-shock, with an adjustment of the m factor in order to equalise the initial and alternative ILM functions

• The volatility of each bank's capital requirements is illustrated by the distance between m_i;1 and m_i;3

[Chart] Factor m vs. SMA capital (M€) for bank A (BI_A = 5 000 M€) and bank B (BI_B = 9 500 M€), showing points C_A;1, C_A;2, C_A;3 and C_B;1, C_B;2, C_B;3: Δm_A = 0,27; Δm_B = 0,28; ΔORCR_A = 218 M€; δORCR_A = 22 M€; ΔORCR_B = 485 M€; δORCR_B = 47 M€.

ORCR: Operational Risk Capital Charge

Page 22: CH&CO_SMA review_Op Risk comments and suggestions


Detailed solution 4 | Proposal of a calibration for the m factor (2/2)
Global calibration | Considering banks across the European financial market

Objective: homogenise the m factor adjustments for similar banks by reducing the variations in capital requirements in case of an extreme shock.

This proposal constitutes an extension of the previous one, once the m factor is calibrated by each bank

Proposals to the Basel Committee

• Simulation of severe shocks and scenarios on all the banks' profiles collected by the Basel Committee

• Classification of the banks according to the sensitivity of their SMA capital to the occurrence of an extreme loss (distance between points C1 and C3)

• Definition of an average interval [m_inf; m_sup] for each bank group, based on the observed variations, to limit the distortion of the clouds

• Each bank in a given group will calibrate its m factor to compute its capital requirements, within the interval allowed by the Committee

[Chart] Factor m vs. SMA capital (M€): point cloud for a given bank class before and after shock, with the interval [m_inf; m_sup] allowed for m factors.

Methodology

Based on the previous scenario, we suggest a 3-step methodology:

• Step 1 | Clustering of the banks according to their stress sensitivity

Our proposal is based on the classification of banks into groups considering their SMA capital sensitivity to a similar scenario (the LC is doubled in this case). This means that, in our proposal, the Committee would analyse, for each bank, the distance between the m factor before and after the shock. The greater the distance, the higher the sensitivity, and vice versa.

• Step 2 | Analysis of the scatterplots (pre- and post-stress)

For each group, the Basel Committee would project:

A cloud of points C1 for each bank of a given group

A cloud of points C3 for each bank of a given group

• Step 3 | Definition of the m factor and associated confidence interval

The Committee calibrates the m factor and defines an optimal interval (maximum and minimum values of the m factor for a given group of banks). Each bank from the same group would have to respect the interval and provide its data to calculate the m factor, so that the Committee can ensure banks stick to the required interval.
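A minimal sketch of steps 1 to 3 follows; the grouping rule, group count and values are illustrative assumptions.

```python
def m_intervals_per_group(banks, n_groups=2):
    """Sketch of steps 1-3: rank banks by the distance between their pre- and post-shock
    m factors (sensitivity), split them into equally sized groups, and derive an allowed
    [m_inf, m_sup] interval per group from the post-shock m values."""
    ranked = sorted(banks, key=lambda b: abs(b["m_after"] - b["m_before"]))
    size = max(1, len(ranked) // n_groups)
    groups = [ranked[i:i + size] for i in range(0, len(ranked), size)]
    return [(min(b["m_after"] for b in g), max(b["m_after"] for b in g)) for g in groups]

banks = [{"m_before": 1.30, "m_after": 1.57}, {"m_before": 1.40, "m_after": 1.68},
         {"m_before": 1.20, "m_after": 1.90}, {"m_before": 1.50, "m_after": 1.62}]
print(m_intervals_per_group(banks, n_groups=2))
```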

Illustration

Page 23: CH&CO_SMA review_Op Risk comments and suggestions


Table of contents

1 Introduction on the review of the SMA methodology based on theoretical profiles

2 Analysis of the Consultative Document of March 2016

3 Detailed solutions to be discussed

4 Outstanding Issues

Page 24: CH&CO_SMA review_Op Risk comments and suggestions


Why does the SMA methodology omit the effects of the plans implemented to hedge the most severe losses (e.g. reinsurance policies, hedging products like cat bonds)?

Scope: risks concerning natural disasters (fire, major flood), pandemics, terrorist attacks, etc., which banks can only anticipate with restricted flexibility

• As it is difficult for banks to mitigate/contain these specific risks, should the hedging costs be accounted for in the loss data treatment (gross loss after recoveries and insurance costs) and in the LC estimation?

• This suggestion is also based on the current opportunity for AMA banks to reduce their OR capital charge by up to 20%. BCBS considered such policies had beneficial effects on the risk exposure but also, and most importantly, on the quality of the OR framework and risk assessment (see BCBS 181, Recognising the risk-mitigating impact of insurance in operational risk modelling, October 2010).

Outstanding Issues
Questions to be discussed

Why do both BI and LC include provision loss amounts?

How should provisioned loss events be considered? As a component of the revenues of the bank (and then included in the BI) or as part of the loss data history (and then included in the LC)?

Scope : Provisioned loss events

• Provisioned loss amounts are considered as observed losses in the LC: they are weighted according to their severity

• Provisioned loss amounts in the BI: they are part of « Other Operating Expenses » (OOE) in the Services Component (cf. Annex 1, Services component, bullet points 2 & 3):

provisions for losses on operational risk events

costs for reserve funds used in the OR hedging of upcoming losses