Risk Management: Bank and Insurance


Giulio Laudani, Cod 20163

RISK MANAGEMENT

Contents
- Interest risk
  - Methods to Compute Interest Risk
  - A proper Scheme to manage Internal Transfer Rate operations
- Credit risk
  - The Expected Loss Estimates
  - The measure of the Unexpected Loss
    - Tools used to estimate UL
    - Comments on the UL tools employed
  - Application and General comments on Credit Risk methods
- Market risk
  - Tools used to estimate market risk
  - Different applications of the Market risk model
- Measure of Volatility
- The general limits of the VAR approach
- Basel Committee Framework
  - Basel I
  - Basel II
- Insurance business
  - Solvency II
    - The formation process and directive structure
    - Some examples
  - Difference between Banks and Insurers
  - A new risk paradigm
    - Problems related to the crisis
    - All these problems have caused
    - How the new RM activity looks after the crisis

Interest risk

The source of this risk is the transformation of maturities and the mismatch between assets and liabilities. Those situations produce an imbalance that leads to:
- Refinancing risk, whenever the maturity of assets is longer than that of liabilities
- Reinvestment risk, when we are in the reverse situation
- Changes on the demand/supply side for liabilities and call loans, hence the risk linked to the elasticity of the demand curve (it won't be studied here)

Those imbalances show up in all items affected by any change in interest rates, hence we should consider not only the trading book but the broader one, the banking book, including any derivative whose value depends on market interest rates. The general rules on how to treat this risk are set out in the Basel Committee framework, whose general principles have been proposed to help/offer guidelines to the national authorities, which are in charge of the supervision of banking institutions, on how to best estimate and manage this risk. Here is a list of the main areas described in the Basel Committee framework:

- Significant involvement of senior management, to overcome the traditional independence of the risk management unit from the other operational divisions; this active role of senior management grants uniformity of criteria, objectivity and proper procedures. This general provision is concretely implemented as follows:
  - The board must be informed regularly and must approve the policies adopted
  - It must ensure that the risk management function has the competence to deal with the risk
  - All the monitoring activity must be put in place and operational limits must exist
  - The bank must communicate its own risk both to the supervisor and to the public

- Create an independent unit which helps senior management in the decision process, providing a technical point of view and avoiding potential conflicts of interest
- Bring the measurement of this risk to a consolidated level, so that it can be adequately appraised and managed
- Integrate this risk measure into the day-to-day management of the bank, to steer corporate policy and the way business is conducted

Methods to Compute Interest Risk:

There exist several models to capture/measure this risk, from the simplest to the more advanced ones; each of them can provide good guidance to the risk manager, since they analyze different angles of the same problem:

Interest gap is an income-based approach: it considers the effect of a change in market interest rates on the net income.
- There are several definitions of Gap[1], from the simplest to the most accurate; all of them depend on the time window considered, hence it is a time-dependent measure
- The first is basically the difference between sensitive assets and liabilities, where by "sensitive" we mean all those items that are going to be re-priced/rolled over within the time frame considered
- The second is the Maturity-Adjusted Gap[2], a weighted average whose weights are the exposures of each instrument within its gap. This procedure is computationally demanding, hence it is simplified by using the concept of marginal gap[3]
- The standardized gap was introduced to overcome the assumption of an identical effect for each category in the balance sheet. It consists of computing a beta for each category
- To take into account the re-pricing delay of those items which lack an automatic indexing mechanism[4], we use the average delay to allocate the instrument to the proper marginal-gap window. A similar approach can be used to capture the interaction between price and quantity effects
- The intrinsic limitation of this approach is that the gap considers only the effect on net income, hence it ignores the effect on the market value of the fixed-rate part of the balance sheet, whose value is affected anyway
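A minimal sketch of the repricing-gap logic above: the change in net interest income for a rate move is approximated by the gap times the rate change. All balance-sheet figures are invented for illustration.

```python
# Repricing gap: rate-sensitive assets minus rate-sensitive liabilities
# within the chosen time window; delta_NI ~= gap * delta_r.

rate_sensitive_assets = 480.0       # illustrative amounts
rate_sensitive_liabilities = 620.0

gap = rate_sensitive_assets - rate_sensitive_liabilities
delta_r = 0.005                     # +50 bps parallel move

delta_net_income = gap * delta_r
print(f"gap = {gap:.1f}, change in net interest income = {delta_net_income:.2f}")
```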

Duration gap is an equity-based approach, more consistent with the new mark-to-market accounting principle. We use the duration, a measure of the residual life of the instrument, and the modified duration (MD), which is the first derivative of the price with respect to the interest rate and hence can be used as a first approximation of the change in value of financial instruments.
- The duration gap is estimated by computing the MD of assets and liabilities, where the latter is multiplied by the ratio of the market value of liabilities to that of assets (financial leverage):

$DG = MD_A - MD_L \cdot \frac{MV_L}{MV_A}$

- Limits of this approach: the immunization lasts a very short time, because the duration changes over time; a change in interest rates changes the MD; it is a linear approximation; it assumes uniform changes in interest rates (basis risk) and parallel shifts across maturities; and it is a costly strategy
- Some of those limitations can be overcome by suitable modifications of the general approach: the basis risk can be addressed by using a measure of sensitivity; the cost can be managed using derivative instruments; and the linear approximation error can be reduced by using the convexity parameter (second derivative, distribution of cash flows around the duration)
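A numerical sketch of the duration-gap formula above: the first-order change in equity value for a parallel rate shift. All amounts and durations are invented for illustration.

```python
# Duration gap sketch: DG = MD_A - MD_L * (MV_L / MV_A),
# and delta_Equity ~= -DG * MV_A * delta_r (first-order approximation).

mv_assets = 100.0        # market value of assets (illustrative)
mv_liabilities = 90.0    # market value of liabilities (illustrative)
md_assets = 4.5          # modified duration of assets
md_liabilities = 2.0     # modified duration of liabilities

leverage = mv_liabilities / mv_assets
duration_gap = md_assets - md_liabilities * leverage

delta_r = 0.01           # +100 bps parallel shift
delta_equity = -duration_gap * mv_assets * delta_r

print(f"duration gap: {duration_gap:.3f}")
print(f"equity change for +100bps: {delta_equity:.3f}")
```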

[1] The usage of the GAP leads to different macro definitions. The marginal GAP is the GAP computed for a specific window inside the horizon considered; it is related to the cumulative one by two relationships: the sum of all marginal GAPs up to time T equals the cumulative GAP at T, and the difference of two cumulative gaps with different horizons gives the marginal GAP between the two horizons. The cumulative GAP is computed over the whole time horizon.
[2] It aims to overcome the limit of the previous one, namely the over-simplistic assumption of an instantaneous change of interest rates for all sensitive items.
[3] The marginal GAP consists of aggregating items into time buckets, on which the relevant value is computed; the maturity is simply the average of the beginning and ending dates.
[4] The underlying hypothesis is that those items respond to the change with some delay.

Cash-flow mapping (clumping) is the model that enables us to consider asymmetrical shifts of the yield curve; furthermore, it is the model suggested by the Basel Committee. The method consists of splitting each instrument into its individual cash flows and converting them into zero-coupon positions, to which the appropriate zero-coupon rate for each maturity is applied.

- In setting up this model several choices must be made:
  - We need to choose the vertices[5] on the zero-coupon curve, taking into account typical bank features: changes in short-term interest rates are greater than in longer rates, volatility decreases as maturity increases, and banks' cash flows are more concentrated in the short term
  - We need to choose discrete intervals, since the item-by-item approach is too demanding; we aggregate items using their residual life[6] or using the MD, to take into account the different sensitivity to rate changes
- This method is suggested by the Basel Committee to calculate a synthetic risk measure through the following scheme:
  - All assets and liabilities are allocated to 13 nodes applying the modified residual life
  - Each node has an associated risk coefficient, given by the product of the MD and the change in interest rate, which is multiplied by the net position of the node
  - It doesn't allow banks to completely offset gaps, to account for asymmetrical movements
  - The resulting measure is compared with the capital requirement

- The Basel method has several limits:
  - Use of book values instead of market values as reference
  - Instruments which repay capital before maturity create a bias in the modified residual life approach [duration drift]
  - Instruments without a fixed renegotiation date, or linked to market interest rates, are hard to allocate
  - The method is not adequate for derivatives
  - No compensation is allowed across different currencies
  - The Basel Committee doesn't provide a unique solution to those problems; it actually leaves the national supervisory authorities free to choose their own models
- The clumping method provides a procedure to break a real cash flow into two virtual ones, in an effort to better match the node vertices and so improve the cash-flow mapping results; it requires an in-depth knowledge of all cash flows, which is why it can be applied only to a small portion of the bank's portfolio. The two virtual cash flows must guarantee a constant portfolio value (same MV) and the same risk, to ensure the same change against interest rate movements:

$MV_1 = MV_0 \cdot \frac{D_0 - D_2}{D_1 - D_2}$ where $MV_1$ is the amount allocated to the first node

$FV_1 = MV_1 \cdot (1 + r_1)^{t_1}$ where the rate and time are those of the first node

A variation of the previous method bases the clumping on price volatility, to take the risk correlation into account: the two virtual cash flows must sum up to the real volatility. (A numerical sketch of the duration-based split follows.)
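A sketch of the duration-based clumping rule above: a real cash flow with duration D0 is split between two adjacent nodes with durations D1 and D2 so that market value and duration are preserved. Figures are illustrative.

```python
# Split a real position between two nodes so that total market value and
# duration are preserved: MV1 = MV0 * (D0 - D2) / (D1 - D2), MV2 = MV0 - MV1.

mv0, d0 = 100.0, 3.2           # real position: market value and duration
d1, d2 = 3.0, 4.0              # durations of the two adjacent nodes
r1, t1 = 0.025, 3.0            # zero rate and maturity of the first node

mv1 = mv0 * (d0 - d2) / (d1 - d2)
mv2 = mv0 - mv1

fv1 = mv1 * (1 + r1) ** t1     # face value of the virtual flow at node 1

print(f"MV1 = {mv1:.2f}, MV2 = {mv2:.2f}, FV1 = {fv1:.2f}")
# Check: total value and duration are preserved
assert abs((mv1 * d1 + mv2 * d2) / mv0 - d0) < 1e-9
```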

A proper Scheme to manage Internal Transfer Rate operations:

The ITR is a series of virtual transactions between the branches and the central risk unit. It aims to centralize all decisions regarding the risk-taking process, to better evaluate the profitability of each branch, to relieve the branches from the need to care about the funding process, and to transfer all the risk to a central unit.

There are two possible schemes. One is the Single ITR, which is the easier to implement, but is criticized because of the unique (arbitrary, not related to market conditions) rate used in the transactions and because it handles only net flows, so the branches are still affected by part of the risk. The other is the Multiple ITR, which overcomes these limitations and allows the use of multiple interest rates for different maturities and of gross flows, so that each operation is taken into consideration.

Here are some examples of peculiar transactions and how the ITR deals with them:
- Fixed-rate transaction: the risk unit takes all the risk; the branch is insured by locking in its financing rate
- Floating-rate transaction: the same as above, plus a premium that repays the higher risk

[5] The nodes used are usually consistent with the maturities available for hedging instruments.
[6] Note that the degree of risk doesn't depend exclusively on residual life, hence we are simplifying the model in this case.

- Transactions not indexed at market rates pose two problems: there is no risk-hedging instrument, and there is basis risk; in fact, if we use a derivative on a similar underlying, the spread between the similar asset and the real one could widen. The bank must decide who bears the basis risk, since neither the treasury nor the branch is able to handle it

- To avoid arbitrage against the treasury, a premium can be added over the internal rates to give an incentive to choose the most favorable rate

- Derivatives and special payoffs (options): in the case of a cap, a floor or an early-repayment option, the price must be paid by the client, either through the spread charged or up-front. The risk can be borne by the branch or by the treasury; the first possibility simply uses options

When building up an ITR system, we need to address all the desirable features, aiming to ensure a proper risk management system within the institution. Hence we need to ensure that any change in a branch's profitability is due only to credit risk, while total bank profitability shouldn't change; we must protect ourselves from interest rate risk and embedded options; we should use a multiple-rate[7] ITR with gross-flow methods[8]; and we should use this system together with a cash-flow mapping process.

Credit risk

Credit risk represents the risk of default of the counterparty (insolvency risk) or of a deterioration in its creditworthiness (migration risk); it arises whenever the discount rate of future cash flows doesn't reflect the risk of the transaction. Besides these main sources of risk there are other components: spread risk [whenever the market spread required for the same risk increases due to an increase in risk aversion], recovery risk, country risk and pre-settlement risk [when a forward contract is cancelled due to the insolvency of the counterparty and the bank is forced to replace it at unfavorable conditions]. The positions accounted for in determining credit risk are the balance-sheet items and the off-balance-sheet ones (OTC transactions).

The real source of risk is the unexpected component: the predictable one is incorporated into the interest rate spread/premium and so totally eliminated. This risk has two components, the expected loss (EL) and the unexpected loss (UL); our task is to estimate the first to properly price the instrument, and to measure the second to raise capital to cover the position (Basel requirement). Estimating this risk and its impact is not an easy task: banks' credit exposures are recorded at historical value and there is no secondary market to easily determine the market value of those positions, hence we need to use internal asset pricing formulas[9].

Here is a set of rules to be implemented to properly set up adequate governance standards:
- Establish an appropriate credit risk environment, meaning that the board must approve and review credit risk strategies, pointing out the bank's risk profile and required profitability. Senior managers must have the responsibility to adequately implement those strategies for all products
- Use a sound credit-granting process: establishing credit limits, new-client policies or amendments, and renewing credit lines with particular care
- Maintain an appropriate credit control process
- Ensure an adequate credit risk control, ensuring independence, IT instruments, bad-loan recovery facilities and properly prudential levels in assessing risk
- Supervisors should check that those requirements are met and pose limits on risk exposure

The Expected Loss Estimates:

The EL is a function of three different components [referring only to insolvency risk] that must be estimated:

The PD estimation:

PD is the probability of default; it can be defined more subjectively or more objectively[10], and it can be assessed by backward-looking methods [human-based decisions or automated algorithms] or by forward-looking ones [expectations on future developments, market data]. It is estimated by applying several models:

[7] Close to those present in the market [even on the ask and bid side], facilitating risk hedging.
[8] Each transaction is considered one by one, with no netting procedure, otherwise the branches would still retain some risk.
[9] Note that to estimate default risk we can simply use book values; we need market values to estimate the effect of migration risk.
[10] Depending on the definition chosen, the PD will be greater or lower; it depends on the criterion used.

Credit scoring models are statistical approaches based on economic variables and financial indicators. They are used to forecast the default and risk level of borrowers, one by one or by discrete grades (the latter procedure is better since it reduces the error).
Linear discriminant analysis consists of classifying the data using different variables, defining a discriminatory value and drawing a boundary line that separates, as clearly as possible, healthy from bad companies. One possible variant is Altman's Z-score, a multivariate model that already suggests some key variables.

- The score can be seen as a "weighted average" of a set of different variables, where unimportant variables get weights close to zero, important ones get high weights, and counterproductive variables get negative coefficients

- Weights are chosen to discriminate as far as possible the good firms' scores from the bad firms' ones; mathematically, we aim to maximize the distance between the two centroids[11] while taking into account their variance [which is assumed to be the same across categories; this hypothesis can be relaxed]

- To test the significance of this model we can use a parameter called Wilks' Lambda, which basically compares the distance between the two centroids. It is the ratio of the variation of the healthy and abnormal scores within their groups to the total variation (similar to an R²)

- From the score we can assess the PD by a formula based on the hypothesis of normally distributed independent variables. It depends on the score $z_a$, on the cut-off level $\alpha$ (which can be simple or more sophisticated) and on the prior probability of default $\pi_a$:

$PD = \frac{1}{1 + \frac{1-\pi_a}{\pi_a}\, e^{\,z_a - \alpha}}$

- The cut-off between companies is computed using past observations and can simply be the midpoint between the two centroids (simple rule), or a given formula can be used to ensure a given PD for the accepted companies; the problem is that we may refuse good companies that fall in the grey/overlapping area

A correction term can be added to take into consideration the average quality of the sample, or the costs of type I and type II errors. It aims to minimize the former by adding a new term which uses, as information, the ratio between the costs of the two error types.

- These approaches suffer from several limits: the assumption of normal distribution (there are variants allowing heteroskedasticity, but they require lots of data) and the so-called sample bias

- To select the meaningful variables there are two possible approaches, backward elimination or forward selection, keeping in mind the rationale of the model

Regression models require defining a sample size, the independent variables and how to estimate the coefficients. Several variants exist (a minimal scoring sketch follows this list):
- Simple linear regression, which poses the problem that the dependent variable is not bounded
- Logit, which substitutes the linear relationship with an exponential (logistic) one
- Probit, which uses the normal cumulative density function (thinner tails compared to the logit)
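A minimal sketch of the logit variant: the linear score is passed through a logistic function so the output is bounded in (0, 1). The ratios and coefficients here are invented for illustration, not estimated on real data.

```python
import math

def logit_pd(ratios, weights, intercept):
    """Map a linear score of financial ratios into a PD in (0, 1)."""
    score = intercept + sum(w * x for w, x in zip(weights, ratios))
    return 1.0 / (1.0 + math.exp(-score))

# Hypothetical borrower: (leverage, ROA, current ratio) with invented weights
ratios = (0.65, 0.03, 1.2)
weights = (2.5, -8.0, -0.9)   # e.g. higher leverage -> higher PD
intercept = -1.0

print(f"PD = {logit_pd(ratios, weights, intercept):.2%}")
```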

Heuristic inductive models:
- Neural networks, which try to mimic human learning (black-box features[12])
- Genetic algorithms, based on survivorship competition (only the best are able to transfer their genes to future generations)

General limits of this family of models:
- They don't consider migration and qualitative issues
- The meaningfulness of the variables used is questionable, since they should be industry-specific (which is not the case), and big samples are needed to avoid the bias caused by the presence of too many healthy companies in the sample

Capital market models are similar in spirit to a VaR approach.
Spot or forward rates are used to estimate the default probability within one year and beyond. The key parameters are the spread, the PD, the recovery rate and the risk-free rate. The PD is computed by comparing the corporate yield curve with the risk-free one, using continuous compounding:
- For maturities over one year we can use long-term spot or forward spreads, where d is the spread and k is the recovery rate:

$p_T = \frac{1 - e^{-d \cdot T}}{1 - k}$ (spot case)

[11] The centroid is the average value of the historical company data for each sample category.
[12] The procedure is obscure; it is compared to a black box from which we obtain results without knowing how it works inside.

$p_{t_1,t_2} = \frac{1 - e^{-d_f}}{1 - k}$ (forward case, with $d_f$ the forward spread for the period)

- The spot spread directly gives us the cumulative probability, while the forward gives the marginal one. The cumulative probability can be obtained from the marginal ones via

$p_T = 1 - (1 - p_{t_1})(1 - p_{t_2})$

while the marginal probability is obtained from the cumulative (spot) ones using

$p_{t_1,t_2} = 1 - \frac{1 - p_{T_2}}{1 - p_{T_1}}$

- Limits of this approach:

  - It is applicable only to listed companies that have listed bonds for all the relevant maturities, and it also suffers from market illusion
  - It assumes a risk-neutral approach[13], hence we should consider/add a premium
  - It relies on the expectations theory, whereas liquidity premia do exist. (A numerical sketch of these formulas follows.)
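A sketch of the spread-implied PD formulas above, with invented spreads and recovery: cumulative PDs from spot spreads, then the marginal PD between two horizons.

```python
import math

def cumulative_pd(spot_spread, t, recovery):
    """p_T = (1 - exp(-d*T)) / (1 - k), from the spot spread d."""
    return (1.0 - math.exp(-spot_spread * t)) / (1.0 - recovery)

k = 0.4                             # recovery rate (illustrative)
p1 = cumulative_pd(0.015, 1, k)     # 1y spot spread of 150 bps
p2 = cumulative_pd(0.018, 2, k)     # 2y spot spread of 180 bps

# Marginal PD between year 1 and year 2: 1 - (1 - P_T2) / (1 - P_T1)
marginal = 1.0 - (1.0 - p2) / (1.0 - p1)
print(f"P(1y) = {p1:.2%}, P(2y) = {p2:.2%}, marginal y1->y2 = {marginal:.2%}")
```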

The Merton model is an example of a structural model, i.e. a model that recognizes default as a consequence of some intrinsic company features, differently from reduced-form models, which simply recognize default as a possible event and reduce the problem to estimating its likelihood.
- Characteristics: it is based on the idea that the shareholders' position is similar to an option to default (exercised when the value of the assets is lower than that of the liabilities):

  - It assumes that the value of the company follows a Brownian motion, with increasing uncertainty over time
  - At maturity it computes the frequency of the final results of the simulated paths and checks how many of them are below the threshold
  - This probability is a function of the asset volatility, the nominal value of debt, the starting company value and the debt maturity

- Contingent claim analysis: since this risk can be hedged using a put option, the investment can be seen as a risk-free investment, where the put premium is a proxy of the default probability. Thanks to this relationship we obtain the risk-neutral default probability, the value of the debt and the interest rate required by the bank

- Analyzing the spreads and default probabilities computed with the Merton model, we notice that the riskier company's spread curve is negatively sloped, while the safer one behaves in the opposite way; this is a direct consequence of survivorship bias for the riskier companies[14]

- Limitations (a minimal PD computation follows):
  - It assumes a unique zero-coupon debt repayment
  - It assumes a Brownian motion to describe the evolution of the asset value
  - It doesn't consider early default, before maturity, i.e. it ignores paths that cross the threshold before time T
  - It assumes a constant risk-free rate (easily relaxed)
  - It doesn't consider migration risk
  - Many of the variables are not observable and need to be estimated
  - It is an arbitrage-free approach, hence it requires those assets to be tradable, which is not the case
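A sketch of the Merton logic: with assets following a geometric Brownian motion, the risk-neutral PD at maturity comes out as N(-d2) from the Black-Scholes machinery. All inputs are illustrative.

```python
import math
from scipy.stats import norm

def merton_pd(v0, debt_face, r, sigma_a, t):
    """Risk-neutral probability that assets end below the debt face value."""
    d2 = (math.log(v0 / debt_face) + (r - 0.5 * sigma_a**2) * t) / (sigma_a * math.sqrt(t))
    return norm.cdf(-d2)

# Illustrative firm: assets worth 120, zero-coupon debt of 100 due in 1 year
print(f"PD = {merton_pd(v0=120, debt_face=100, r=0.03, sigma_a=0.25, t=1.0):.2%}")
```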

The KMV model is an attempt to overcome some of Merton's limits, namely the single-debt assumption and the estimation problem.
- Characteristics: it values equity as a call option with maturity equal to the residual life of the debt. Thanks to this idea it can estimate the value of the company and its volatility: it uses the price of a European call and Itô's lemma to obtain the equation for the volatility (two unknowns, two equations)

- The KMV approach takes two steps:
  - A risk index is computed using the innovative concept of distance to default. Thanks to this innovation the existence of short-term and long-term debt is acknowledged: the company defaults only if the asset value drops below the default point, set between the short-term debt and the total debt:

$DP = STD + \frac{1}{2} LTD \qquad DD = \frac{V_0 - DP}{V_0 \cdot \sigma_V}$

  - This index is then converted into a probability (EDF) using an empirical mapping
- Benefits are the speed with which it adapts to changes in financial conditions, its stability across the economic cycle, and the possibility of assigning a firm-specific EDF

[13] This model assumes a risk-neutral approach, meaning it starts from the equality between the risk-free rate and the risky asset return weighted by its survival probability.
[14] Over time the pool of weaker companies gets stronger and stronger, due to the repayment of debt and the death of the worst among them.

- Limits: it can be used only for traded companies (this can be addressed by using comparables) and it relies on the efficiency of the market. (A numerical sketch of the distance to default follows.)
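A sketch of the two-step KMV recipe above: default point from short- and long-term debt, then the distance to default. The final DD-to-EDF mapping is an empirical (proprietary) table at KMV, so here it is left as a comment. All figures are illustrative.

```python
def distance_to_default(v0, sigma_v, std, ltd):
    """DD = (V0 - DP) / (V0 * sigma_V), with DP = STD + 0.5 * LTD."""
    dp = std + 0.5 * ltd
    return (v0 - dp) / (v0 * sigma_v)

# Illustrative firm: asset value 150, asset vol 20%, STD 60, LTD 50
dd = distance_to_default(v0=150, sigma_v=0.20, std=60, ltd=50)
print(f"default point = {60 + 0.5 * 50}, DD = {dd:.2f}")
# KMV would now look DD up in its empirical historical table to obtain an EDF.
```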

EAD Estimation:

EAD is the exposure at default. It is generally assumed to be a deterministic number, but it may be stochastic if the borrower has the right to change the size of his exposure; in this case we need to assess the drawn and undrawn parts[15]. Related to EAD is the recovery rate (RR), computed by applying different formulas: the gross, actual and loading-duration methods.

$RR = \frac{RA}{EAD}$ (gross method)

$RR = \frac{PV(RA)}{EAD}$ (actual method, discounting each flow from its maturity to the occurrence of the default)

$RR = \frac{RA \cdot (1 - AD)}{EAD \cdot (1 + r)^{d}}$ (loading-duration method, where d is the difference between the RA duration and the EAD duration)

$d_{EAD} = \frac{\sum_t t \cdot PV(EAD_t)}{\sum_t PV(EAD_t)}$ where t is the time elapsed from default

Here RA is the recovered amount, AD the administrative and legal costs, r the appropriate discount rate and d the duration of the process, taking intermediate flows into account. (A numerical comparison of the three methods follows.)
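A sketch comparing the three recovery-rate formulas above on one invented workout; amounts, costs and timing are all illustrative, and the EAD duration is assumed to be zero (exposure measured at the default date).

```python
# Recovery rate under the gross, actual and loading-duration methods.
ead = 100.0          # exposure at default
ra = 70.0            # recovered amount (nominal), received after 2 years
ad = 0.05            # administrative/legal costs, as a share of RA
r = 0.04             # discount rate
t_recovery = 2.0     # years from default to recovery

rr_gross = ra / ead
rr_actual = (ra / (1 + r) ** t_recovery) / ead   # PV of the recovery
d = t_recovery - 0.0                             # RA duration minus EAD duration (assumed 0)
rr_loading = ra * (1 - ad) / (ead * (1 + r) ** d)

print(f"gross: {rr_gross:.2%}, actual: {rr_actual:.2%}, loading: {rr_loading:.2%}")
```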

LGD Estimation:

LGD is the expected loss rate in case of default; like PD, it depends on the definition of default chosen by the bank (the narrower the definition, the lower the LGD). Its value depends on these drivers: characteristics of the borrower (industry sector, country, speed of recovery procedures, financial ratios); characteristics of the exposure (presence of collateral or guarantees, priority level); characteristics of the bank (facilities to recover loans and out-of-court settlements); and external factors, such as the economic cycle [common with PD] and the level of interest rates.

There are different procedures to estimate it; all of them have to take into consideration the indirect costs borne by the bank to recover the proceeds and the time needed:
- Market LGD is similar to the PD estimate using market rates: knowing the other parameters, we can infer the implied LGD
- Workout LGD consists of building a historical database from which we can extract the LGD for a given type of borrower. This method is the only one applicable to bank loans, which don't have a secondary market from which to extract the market value after credit events
  - We need to choose an appropriate discount factor for the recovery amount, which must be forward-looking since the procedure will start in the future (after the beginning of the credit line)
  - We need to estimate the duration of the process itself (the book gives an example at page 350). Note that this model is the preferred one

It is common in risk management to look for key variables that explain the different recovery rates observed empirically. This topic is important to understand recovery risk, given the high volatility found in the databases: the distribution is concentrated in the tails; the industry sector is a key element in explaining differences; the priority level is important, but not stable over time; and the presence of collateral is important.

Recovery risk is the risk of achieving an LGD different from the expected one; it is usually quite sizable and fluctuates over time, following a bimodal distribution concentrated in the tails.
- This risk arises from the uncertainty of these variables: the amount to be recovered (EAD), the administrative costs to be borne, the discount rate for future cash flows at a future date, and the duration of the whole process

- It is calculated using these two formulas:

Assuming a non-stochastic LGD: $\sigma_{UL} = LGD \cdot \sqrt{PD(1-PD)}$

[15] The undrawn part needs to be estimated considering the portion that will be drawn at default; to ease this task banks charge fees on this part, which allows a more rational pricing (common in Anglo-Saxon countries).

Assuming a stochastic LGD: $\sigma_{UL} = \sqrt{PD(1-PD) \cdot \overline{LGD}^2 + PD \cdot \sigma_{LGD}^2}$

The link between LGD and PD must be taken into account, since they share common systematic factors: the relationship between recovery rates and default rates is negative (so LGD and PD move together), as empirical evidence on junk bonds has proven. Ignoring this risk may lead to an underestimation of risk. It is affected by: chain effects (an economic downturn may reduce the value of assets); financial assets and interest rates, together with real estate; and industry-specific factors (inventory may lose more value in certain industries). (A numerical sketch of the two volatility formulas above follows.)
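A sketch of the two unexpected-loss volatility formulas above, on one invented exposure.

```python
import math

pd_, lgd, sigma_lgd = 0.02, 0.45, 0.20   # illustrative PD, mean LGD, LGD volatility

# Non-stochastic LGD: sigma = LGD * sqrt(PD * (1 - PD))
sigma_det = lgd * math.sqrt(pd_ * (1 - pd_))

# Stochastic LGD: sigma = sqrt(PD*(1-PD)*LGD^2 + PD*sigma_LGD^2)
sigma_stoch = math.sqrt(pd_ * (1 - pd_) * lgd**2 + pd_ * sigma_lgd**2)

print(f"UL volatility, deterministic LGD: {sigma_det:.4f}")
print(f"UL volatility, stochastic LGD:    {sigma_stoch:.4f}")
```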

The measure of the Unexpected Loss:

The UL can simply be defined as the volatility around the mean loss (EL), or through some credit portfolio model based on a VaR approach. This kind of exposure can be effectively reduced by setting up a diversification policy.

We need to choose an appropriate time horizon and confidence level.
- The time horizon is arbitrarily set to 1 year, because both subjective and objective criteria are not applicable, due to the presence of illiquid markets, of contracts without an explicit maturity, and so on. However this choice has a great operational fit: the bank uses yearly budgets, hence a yearly measure eases the budgeting workout; one year is usually the right time span to raise new capital; the turnover of the assets is usually close to one year; and some models account for long-term risk by including migration risk in their computation.
- For the confidence level, consider that the data cannot be explained by a zero-mean normal distribution, since the mean is below zero and the data are strongly asymmetrical. Hence the choice of confidence level needs to be credible, because the measure is less stable and changes a lot at different levels.

Tools used to estimate UL:

It is estimated using portfolio models that apply different definitions of default (including migration or not) and different treatments of loan correlation (explicitly or implicitly modeled), as well as of EL.

CreditMetrics is a multinomial approach: it considers every change in the borrower's rating as a credit event.
- The model basically consists of deriving the empirical distribution of movements across rating classes, using the respective spread for each maturity to compute the expected value of the assets, and taking the difference between this value and the face value to compute the expected loss
- We should compute the VaR percentile using the probability of each possible movement in the rating level to obtain a measure of UL; since the distribution is not normal, we cannot use the standard-deviation-times-percentile approach

- To answer Wilson's critique regarding the treatment of the economic cycle, CreditPortfolioView was introduced, which takes economic variables into consideration. Note that if the transition matrix used is point-in-time there is no need to adjust for the economic cycle; doing so would double count

- To use the model for a portfolio of assets we need to estimate the correlation, through the following steps (a joint-probability sketch follows this block):
  - A modified version of the Merton model, where we find on the normal distribution all the thresholds for default and for each rating change; using integrals we can then compute all the marginal and cumulative migration probabilities for each asset
  - We pool all this information together with a bivariate normal (given the correlation), and the resulting distribution gives us all the joint probabilities:

$\iint \frac{1}{2\pi\sqrt{1-\rho^2}} \exp\left(-\frac{x^2 - 2\rho x y + y^2}{2(1-\rho^2)}\right) dx\, dy$

  - The asset correlation used in the preceding formula is computed using large building blocks: correlations are first estimated among a large set of industries and countries ("risk factors"); for each borrower a set of weights must be specified, expressing its sensitivity to the different risk factors and to idiosyncratic risk. Combining those weights and the risk-factor correlations, an estimate of the pairwise correlation of two firms can be obtained:

$\rho_{a,b} = \beta_1 \beta_2 \rho_{1,2} + \dots + \beta_{n-1} \beta_n \rho_{n-1,n}$

  - This is clever because asset value changes (unlike the distribution of losses) can be described by a normal distribution; the pair can then be described through a bivariate normal with correlation parameter ρ, and from this distribution the joint probabilities of the values of the two loans can be inferred, taking into account the transfer of risk
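A sketch of the CreditMetrics-style joint default probability: default thresholds are taken from each borrower's PD on a standard normal, and the bivariate normal CDF with asset correlation ρ gives the joint probability. Inputs are illustrative.

```python
from scipy.stats import norm, multivariate_normal

pd_a, pd_b, rho = 0.02, 0.05, 0.30   # illustrative PDs and asset correlation

# Default thresholds on the standardized asset-value distributions
z_a, z_b = norm.ppf(pd_a), norm.ppf(pd_b)

# Joint default probability: bivariate normal CDF at the two thresholds
cov = [[1.0, rho], [rho, 1.0]]
joint_pd = multivariate_normal(mean=[0.0, 0.0], cov=cov).cdf([z_a, z_b])

independent = pd_a * pd_b
print(f"joint PD with rho={rho}: {joint_pd:.4%} (vs {independent:.4%} if independent)")
```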

- Benefits:
  - Uses objective and forward-looking market data (interest rate curves and stock index correlations)
  - Evaluates the portfolio market value
  - Takes migration risk into account
- Limits:
  - Needs a lot of data: forward rates, transition matrices
  - Assumes the bank is a price-taker
  - Assumes stable transition matrices
  - Proxies correlations with stock indices
  - Maps counterparties to industries and countries in an arbitrary and discretionary way

CreditRisk+ is based on an insurance (actuarial) approach, using the Poisson distribution to estimate the probability of default.
- The idea behind this model is the equivalence between bank and insurance risk; the major difference is the correlation among the bank's clients. It aims to assess the portfolio risk; it doesn't give us PDs. Its inputs are the PDs and their volatilities
- We use a banding method to link default probabilities with losses. It consists of creating categories (bands) of similar expected loss and computing the expected number of defaults for each band, which is used in the Poisson distribution; to aggregate, it is advisable to weight the default numbers by the ratio between each exposure and the band's expected loss
  - The bands are created by dividing all exposures L_i by a base unit L and rounding up, obtaining standardized values v_i. The recovery rate is used to determine the exposure (as booked) in a deterministic way
  - The Poisson distributions, one per band, must be aggregated. This first version of the model assumes independence between variables, but this can be overcome by assuming that the expected number of defaults in each band is itself random (see below)
- The Poisson distribution assumes independent events and works only for small PDs. To capture correlation we run n simulations across all the bands [assuming the PDs are stochastic] and then take the weighted average conditional on each scenario's occurrence; this new distribution is the unconditional distribution, accounting for asset correlation

- Benefits:
  - PDs and exposures (book values) net of recovery are enough; the "correlated" version also requires the sensitivities to the economic cycle factors
  - An analytical solution exists, hence it is fast to implement
  - The distribution of losses can be obtained without resorting to simulation techniques
- Limits (a simulation sketch of the banding idea follows):
  - It only looks at default risk, so it does not consider migration risk
  - It assumes constant exposures and does not consider recovery risk
  - It isn't a dynamic model, meaning it cannot be used on a changing portfolio without redoing all the calculations
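A minimal Monte Carlo sketch of the banding idea behind CreditRisk+. The real model derives the loss distribution analytically; here the independent Poisson counts per band are simply simulated, with invented band data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Bands: standardized exposure v_j (multiples of the base unit L) and the
# expected number of defaults mu_j per band (illustrative numbers).
L = 100_000.0
bands = [(1, 3.0), (2, 1.5), (5, 0.4)]   # (v_j, mu_j)

n_sims = 100_000
losses = np.zeros(n_sims)
for v, mu in bands:
    defaults = rng.poisson(mu, n_sims)    # independent Poisson counts per band
    losses += defaults * v * L

el = losses.mean()
var_99 = np.percentile(losses, 99)
print(f"EL = {el:,.0f}, 99% unexpected loss = {var_99 - el:,.0f}")
```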

Comments on the UL tools employed:

Some comments on the major characteristics of those models:
- Default-mode versus multinomial: only CreditRisk+ belongs to the first group
- Future values vs. loss rates: a model can be based on the distribution of possible values or of possible future losses; the first uses as input the spread curve by maturity, while in the second the spread does not need to be known. CreditMetrics is a typical market-value model, while CreditRisk+ is a loss model
- Conditional vs. unconditional: CreditPortfolioView belongs to the first group; this distinction is useful only if the model works through the cycle
- Monte Carlo vs. analytical solution
- Asset correlation vs. default correlation: this distinction is less important than the others, in fact the two are close to each other. CreditMetrics belongs to the asset-correlation group, CreditRisk+ to the second one

Some major limits:
- Recovery risk is not treated as random (except in CreditRisk+) and is assumed independent from PD
- Exposure risk (usually treated as known) is assumed independent from default, while empirically they are positively related, hence this assumption leads to an underestimation
- Credit risk and market risk are assumed independent
- Backtesting is impossible: with yearly frequency there is not enough data

Application and General comments on Credit Risk methods:

- Loan pricing: assessing EL and UL through a transparent process that properly considers the cost of the capital absorbed. There remains the problem of attributing the marginal diversification benefits

$d_{EL} = \frac{ELR \cdot (1+r)}{1-ELR}$ where ELR is the expected loss rate; $d_{EL}$ is the spread that compensates for the EL

$d_{UL} = \frac{VaR \cdot (r_{eq}-r)}{1-ELR}$ where VaR is a relative measure and $r_{eq}$ is the cost of equity

- Risk-adjusted performance measurement is used to decide whether to undertake a specific investment:

$RAROC = \frac{r_L \cdot (1-ELR) - r - ELR}{VaR}$ where $r_L$ is the lending rate applied; this number is compared with the bank's target RAROC

- Setting limits on the risk-taking of the different business units; it is crucial to properly define the appropriate level of aggregation for the units, the VaR limits and the frequency with which they are checked, as well as their involvement in the budgeting process

$MaxAmount = EL / ELR$: this formula gives the maximum loan amount that can be granted

- Optimizing the portfolio composition: this is limited by the characteristics of banks' loans (geographical concentration, no secondary market and limited rotation). However, thanks to recent developments in derivatives and secondary markets it is now possible; the best solution is to separate the risk management optimization from the origination process. (A numerical pricing/RAROC sketch follows.)
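A sketch of the pricing and RAROC formulas above on one invented loan; all rates are illustrative.

```python
# Loan pricing and RAROC with illustrative inputs.
elr = 0.01       # expected loss rate
r = 0.03         # funding / risk-free rate
r_eq = 0.10      # cost of equity
var_rel = 0.06   # VaR as a fraction of the exposure
r_lending = 0.055

d_el = elr * (1 + r) / (1 - elr)             # spread compensating the EL
d_ul = var_rel * (r_eq - r) / (1 - elr)      # spread remunerating the capital absorbed

raroc = (r_lending * (1 - elr) - r - elr) / var_rel
print(f"EL spread: {d_el:.2%}, UL spread: {d_ul:.2%}, RAROC: {raroc:.2%}")
# The loan is acceptable if its RAROC exceeds the bank's target (e.g. the cost of equity).
```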


Market risk

Market risk is usually identified with the risk inherent in the trading book (short term), but it should be extended to those investments intended to be kept on the financial statements for a longer period. Nowadays it has gained more importance due to the new accounting principles, the securitization process and the growth of derivative instruments. The key elements of this type of risk are: exchange risk; interest risk (different from the previous one because it relates only to securities that have a secondary market, and it affects only a limited part of the balance sheet); equity risk; volatility risk; and commodity risk.

The traditional approaches were:
- Nominal value, which considers risk as proportional to the nominal value. It has severe limitations: the nominal value doesn't reflect market value, it doesn't capture the different degrees of sensitivity to a change in the risk factor, and it doesn't consider volatility and correlation
- Sensitivity analysis, based on coefficients representing the risk of each security category. Drawbacks remain: the coefficients cannot be aggregated (even though each position having its own measure makes communication between divisions and senior management easy), and, like the nominal value method, it doesn't consider volatility and correlation; ignoring the volatility of the risk factor basically means ignoring the real risk of the position

Tools used to estimate market risk

VaR models are characterized by a confidence level, the maximum potential loss that a portfolio can suffer, and a certain time horizon. This measure is comparable across all security classes. These models aim to define the risk factors and the probability distribution of those risks, and to summarize that information in one risk parameter.
The easiest one is the parametric approach: changes in market factors are normally distributed, all the relevant information is summarized by the Var-Cov matrix, the possible losses are related to the risk factors by a linear function, and VaR is simply a multiple of the standard deviation.
- The most critical hypotheses are the ones that ignore the following problems:
  - Empirical data show that the distribution of risk factors is skewed and has fatter tails
  - It works by simply inverting the normal density to compute the relevant percentile [VaR = MV·σ·α·β]
  - The linear relationship is represented by the sensitivity coefficient, which (for example) is the modified duration for bonds; this is the delta-normal approach. Alternatively we can assume that prices are log-normally distributed, hence there is a one-to-one correspondence; this is the asset-normal approach

- Typical issues:
  - The confidence level choice represents the degree of risk aversion of the financial institution, or the target capital requirement that allows the balance sheet to reflect the creditworthiness expected by investors; in fact there is a positive empirical relationship between those values
  - The time horizon is usually short (daily); the choice of the appropriate length is crucial, since the longer the horizon, the higher the VaR value
  - The bank must take into account the liquidity of its position, its size, and a subjective view of the time the instrument will stay in the trading book
  - It is important to have enough observations to ensure the significance of the estimate, and for longer time horizons this may be a problem; sometimes, to overcome it, the LRW hypothesis can be used to convert daily data into weekly or monthly figures

- When we apply this method to a portfolio we need to account for the correlation between assets: portfolio VaR is the square root of the correlation matrix pre- and post-multiplied by the vector of the individual VaRs. This formula grants diversification benefits and preserves the sub-additivity property
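A sketch of the parametric aggregation just described: individual VaRs combined through the correlation matrix, VaR_p = sqrt(v' C v). Numbers are illustrative.

```python
import numpy as np

# Individual position VaRs (same confidence level and horizon)
v = np.array([10.0, 6.0, 4.0])

# Correlation matrix of the risk factors (illustrative)
C = np.array([[1.0, 0.3, 0.1],
              [0.3, 1.0, 0.5],
              [0.1, 0.5, 1.0]])

var_portfolio = float(np.sqrt(v @ C @ v))
print(f"sum of stand-alone VaRs: {v.sum():.2f}")
print(f"portfolio VaR:           {var_portfolio:.2f}")  # <= sum: diversification
```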

- When we want to use multiple risk factors we need to break down each security into elementary components, each depending on a single risk factor, and then aggregate them as a portfolio. The suggested approach is the mapping of risk positions:
  - A foreign-currency bond has two risk components; we compute each single VaR and then aggregate them using their correlation
  - A forward currency position has three components: the spot exchange rate, a spot investment and a spot borrowing
  - A forward rate agreement consists of two components: a spot debt position with maturity prior to that of a spot investment position
  - Stock positions: we can consider each stock as a risk factor, or use fictitious positions in the relevant stock indexes, mapping the position to the relevant betas; this works only for well diversified portfolios without specific components
  - Bonds are instead mapped through their individual cash flows (clumping)

- Limitations of this model [note that it works better for single risk profiles]:
  - Risk factor changes are assumed to be normally distributed
  - Since risk factors are assumed to follow a Brownian motion, volatility is assumed stable over time and serially independent across risk factors, which is not empirically true
  - The relationship with losses is assumed to be linear

- Possible solutions:
  - For the normality hypothesis: use a mixture of Gaussian distributions (which addresses the fat-tail problem but not the empirical asymmetry) or a t-Student distribution
  - Non-linearity can be addressed with a curvature parameter, but this poses the problem that the distribution of the underlying changes is no longer normal, since the curvature term follows a chi-square distribution. Furthermore, by introducing a new parameter we increase the model error, and, above all, there is the underlying hypothesis that the payoffs are differentiable, otherwise the approach is inapplicable. It is also less effective for joint shocks. Another solution is the full valuation approach

Simulation models allow different hypotheses compared with the previous one: the risk factors may have different distributions; since the impact of each risk factor change is valued by full valuation, the relationship between losses and risk factors can be non-linear; and they are flexible. Hence these models are best used for non-linear payoffs and for valuing extreme events [stress tests].

- Full valuation is a way of estimating price variations, not a simulation model per se: it consists of re-pricing the securities applying a pricing formula (which must be defined a priori)
- All distributions are empirical, i.e. based on the simulation. Here lies the difference between historical and Monte Carlo simulation: the first uses the empirical distribution obtained from the observations, while the second defines a parametric distribution
- The percentile logic is used: we generate scenarios applying a given distribution, order them, and compute the relevant percentile; the VaR is the difference between the asset's current value and its percentile value

- There is great flexibility in defining the behavior of market risk changes
Historical simulation transforms historical data into possible future behavior: it uses the past distribution, unchanged, to predict the future one.
- Merits of this model:
  - It is simple to understand and to communicate
  - It doesn't require hypotheses about distributions or correlations (no Var-Cov matrix required)
  - No linearity is required between risk factors and price changes
- Limits and possible solutions:
  - It assumes a stable and stationary distribution given by past data
  - It may not be applicable if the time series is short, and there is the usual tradeoff between sample length and relevance; however, using a hybrid approach that puts together exponential weights and the simulation approach, we can use a longer sample while preserving relevance. This hybrid approach also frees the model from the stable-distribution assumption; note that it is an approximation: since current data have higher weights, any past change in behavior is smoothed

- The methodologies of bootstrapping and path generation are used to obtain bigger sample sizes:
  - Bootstrapping avoids the loss of observations when we move from daily to weekly or monthly frequencies. It consists of extracting observations (with replacement) to build X paths (useful for exotic option pricing, where knowing the path is important in defining the price), from which we extract the distribution. The underlying hypothesis is that observations are i.i.d.
  - To overcome the i.i.d. hypothesis we can use something similar to the hybrid approach, or two better proposals:
    - Hull and White suggest adjusting the data by weighting with current volatility, i.e. rescaling returns proportionally; heteroskedasticity can be incorporated by rescaling the time series with the information available at time T
    - Filtered historical simulation is based on GARCH models, which are used to filter the data and obtain residuals from which scenarios are created. Each residual is standardized (hoping it thereby becomes i.i.d.) and used in a path-generation process where the starting data point is the filtered return, not the sample one, multiplied by the estimated conditional volatility of the next period (volatility predicted by the GARCH model times the previous residual)
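A sketch of plain historical simulation, the baseline the refinements above improve on: revalue the position on past returns and read the VaR off the empirical percentile. The return series here is randomly generated just to make the snippet runnable.

```python
import numpy as np

rng = np.random.default_rng(1)
returns = rng.standard_t(df=5, size=1000) * 0.01   # stand-in for observed daily returns

position_value = 1_000_000.0
pnl = position_value * returns                      # full-valuation P&L per scenario

# 99% one-day VaR: loss at the 1st percentile of the empirical P&L distribution
var_99 = -np.percentile(pnl, 1)
print(f"99% 1-day historical-simulation VaR: {var_99:,.0f}")
```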

Monte Carlo simulation is based on the definition of an a priori parametric distribution, which should be consistent with future data behavior; increasing the number of risk factors involved proportionally increases the number of variables, so it remains computationally demanding.
- Differently from historical simulation, it requires computing or assessing the correlation between risk factors a priori (otherwise independence is implicitly assumed), but it allows to know the path evolution

- To take the relationships into account we need a correlation matrix, which is decomposed into two triangular matrices (Cholesky decomposition) and used to build up scenarios (a sketch follows below)
- A limit is the need for an asset-based estimation of the Var-Cov matrix to find the joint distribution
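A sketch of the Cholesky step just mentioned: the correlation matrix is factorized into triangular matrices and used to turn independent normal draws into correlated scenarios. All parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

corr = np.array([[1.0, 0.6],
                 [0.6, 1.0]])
vols = np.array([0.01, 0.02])         # daily vols of the two risk factors

L = np.linalg.cholesky(corr)           # corr = L @ L.T, L lower-triangular
z = rng.standard_normal((10_000, 2))   # independent standard normal draws
scenarios = (z @ L.T) * vols           # correlated risk-factor changes

print("sample correlation:", np.corrcoef(scenarios.T)[0, 1].round(3))
```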

Stress tests aim to estimate the effects connected with extreme events: the simulation is carried out in a predominantly arbitrary and subjective manner, for just a few scenarios.
- Use past shocks and simulate them
- Factor-push analysis, i.e. use movements of several standard deviations; note the problem that large variations of insignificant risk factors are not significant
- The Derivatives Policy Group suggestion is a uni-dimensional process (one variable at a time) based on interest rate movements in slope or shift, as well as on other macro indicators
- Multidimensional scenarios can be implemented in two ways:
  - Simple: only some risk factors are allowed to change while the others are kept constant
  - Predictive: the same as the previous one, but the others are changed according to their correlation with the moving one
- This instrument should be a complement to the previous ones and must be followed up by practical actions to reduce risk, hence vulnerability. It also allows testing liquidity risk

Different applications of the Market risk model:
- A unique risk measure for both horizontal and vertical communication between divisions
- Portfolio analysis is possible: there are several methods to aggregate risk factors
- It is useful to determine risk exposure limits, as nominal value, market value exposure (remembering that those measures will be affected by changes in volatility) and maximum tolerable variation
- It can be used to risk-adjust returns through RAROC [both ex-ante and ex-post], a profit/risk ratio; thanks to this instrument it is possible to monitor the units' returns and properly set incentive procedures


Measure of Volatility

Volatility estimation approaches can be divided into two groups: historical (with constant or time-varying parameters) and implied from options. Volatility is a crucial parameter in the Var-Cov model. For each of the following models we need to define the time horizon/frequency. (An EWMA sketch follows this block.)
- The simplest is the computation of the Var-Cov matrix using an equally weighted average, or an exponential one ("RiskMetrics"), with historical data as source:
  - The simple moving average model poses two problems: the sample size (the longer, the more information, but the less reliable for prediction purposes, since it doesn't reflect the current situation) and the echo effect (any shock strongly affects the value both on entering and on exiting the sample)
  - The exponential moving average overcomes the previous problems, but poses the problem of choosing the appropriate decay factor (which should depend on the data behavior, i.e. how fast they react to changes; note that it changes over time) and the number of past observations to use; this is addressed by increasing the frequency so as to maximize the information content and minimize the sampling error
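A sketch of the exponential (RiskMetrics-style) volatility update, sigma²_t = λ·sigma²_{t-1} + (1-λ)·r²_{t-1}, with the usual λ = 0.94 for daily data; the return series is synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)
returns = rng.standard_normal(500) * 0.01   # synthetic daily returns

lam = 0.94                    # RiskMetrics decay factor for daily data
var_t = returns[:30].var()    # seed the recursion with a simple sample variance
for r in returns[30:]:
    var_t = lam * var_t + (1 - lam) * r**2   # exponentially weighted update

print(f"EWMA daily volatility estimate: {np.sqrt(var_t):.4%}")
```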

GARCH stands for Generalized AutoRegressive Conditional Heteroskedasticity. It uses the maximum likelihood criterion, hence it uses market data to get a better estimate of the decay factor; however, it needs lots of data points.
- The two common factors are:
  - Past volatility, whose coefficient indicates the degree of persistence (typically above 0.7)
  - The squared past prediction error, whose coefficient indicates how quickly volatility adapts to new market shocks (lower than the first one)
- Benefits of this methodology:
  - It recognizes the serial correlation
  - It gives adequate importance to new information
  - The decay factor is directly determined by market data

o It use a normal distribution to describe the behavior of the prediction error, hence it is a poor method in case of skewness or leptokurtosis. The later problem is overcame by using t-student distribution while the first one (which come out of the fact that this model gives same importance to the sign of shocks, which is against empirical date) is solved by models that recognize this asymmetry, i.e. the negative effect has an higher impact:

IG-ARCH require that the sum of the coefficient is one, hence if the constant is set to zero is basically the exponential moving average formula

EG-ARCH models the natural log of the variance (allowing the equation to give negative output), so that instead of square value it use absolute value and real value (response to good and bad is consider)

AG-ARCH use the square, but it centered the data using a parameter which will amplify the negative effect an reduce the positive one

o It gives good measure for the immediately following period, but it is less informative for subsequent, in fact, since the parameters should guarantee mean reversing or better a converge series, it converge to the long term value
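A minimal GARCH(1,1) sketch showing the one-step variance update and the multi-step forecast converging to the long-run level; the parameters are illustrative (persistence alpha + beta < 1 guarantees mean reversion):

```python
import numpy as np

omega, alpha, beta = 1e-6, 0.08, 0.90
long_run_var = omega / (1.0 - alpha - beta)

r_today, var_today = -0.02, 1.5e-4          # assumed return and variance today
var_next = omega + alpha * r_today**2 + beta * var_today   # one-step update

# The h-step-ahead forecast decays geometrically toward the long-run variance:
for h in (1, 5, 20, 250):
    var_h = long_run_var + (alpha + beta) ** (h - 1) * (var_next - long_run_var)
    print(f"h = {h:3d}  forecast vol: {np.sqrt(var_h):.4%}")
```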

Implied volatility is basically the volatility extracted from derivative instruments, hence it represents the market expectation over that period; it is not really used in risk management:
o It may be affected by counterparty risk or by the liquidity level
o It changes depending on the contract chosen (at/in/out of the money), on the maturity and on the model used to price the option (the model must be reliable and the instrument must be liquid to ensure efficiency)
o The maturity and the time horizon of the risk management system should coincide to ensure consistency
o Computing covariance values is complex and we may not have the data; moreover we need to check that the numbers are consistent with each other: the matrix must be at least positive semi-definite. There are two ways: using derivatives with more than one underlying, such as quanto options, or the method at page 182

The general limits of the VAR approach are:

It doesn't define the behavior in case of extreme events beyond the threshold (although it recognizes them); however the purpose of VAR isn't to make the bank completely safe from bankruptcy, but to reduce this risk to an acceptable level. Remember that a higher capital requirement will lower the net income
o However VAR fails to give us a measure (probability) that discriminates among the excess losses of different portfolios: it isn't possible to understand which is higher

VAR won't take into consideration the customer relationship or any other qualitative indicator; however the bank is not forced to use VAR as the only method to grant credit lines, and the management may well use other, qualitative methods
Some of the assumptions behind VAR models are questionable; however, if they are understood they can be overcome. Furthermore VAR is a good tool to allocate resources among units according to risk, hence it gives more insight than other tools
Different VAR approaches can lead to different values, which can be seen as a weakness; however, since the results highly depend on the model, it could simply be that the underlying hypotheses of some approaches are not appropriate. Furthermore, any bias is uniform across all securities, hence the relative risk allocation or perception is unchanged

VAR is a too cyclical measure (to a different degree for each method); however traders' behavior follows this trend during financial crises
VAR comes too late, meaning it is not able to predict crashes; however VAR's purpose isn't to predict crashes (which is impossible for any historically based predictive method), but to generate consistent and uniform risk measures during normal conditions for the day-by-day business regulation
VAR should have a sub-additivity property (diversification reduces aggregate VAR); however this does not always hold (e.g. for non-parametric models), hence it may strongly underestimate risk
o This last problem can be solved by a different measure: the Expected Shortfall. It gives an idea of the possible losses in excess of the VAR threshold, being the average of the losses beyond that threshold. It ensures sub-additivity and gives the supervisory authority a measure of how much money is needed to bail out the bank, or the expected payment an insurer would have to make against bearing this risk (a sketch follows below)
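A minimal historical-simulation sketch of 99% VaR and Expected Shortfall; the heavy-tailed P&L sample is simulated only to keep the example self-contained:

```python
import numpy as np

rng = np.random.default_rng(1)
pnl = rng.standard_t(df=4, size=10_000) * 1e4   # assumed heavy-tailed daily P&L

alpha = 0.99
var = -np.quantile(pnl, 1.0 - alpha)            # loss exceeded 1% of the time
es = -pnl[pnl <= -var].mean()                   # average loss beyond the VaR

print(f"99% VaR: {var:,.0f}   99% ES: {es:,.0f}")  # ES >= VaR by construction
```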

Using the backtesting methodology we are able to compare the performance of different VaR models and to check their consistency with the confidence level. The method basically counts the exceptions, in order to test both their independence and the coverage of the confidence interval; two tests are proposed:
o The Kupiec test: the null hypothesis is that the model performs well
  - We compute an LR test between two binomial likelihoods: the numerator without conditioning (hence using the theoretical percentile implemented) and the denominator conditional on the empirical observations. The test is distributed according to a chi-square with one degree of freedom:

$$LR_{uc} \;=\; -2\,\ln\frac{p^{x}\,(1-p)^{N-x}}{(x/N)^{x}\,(1-x/N)^{N-x}} \;\sim\; \chi^{2}(1) \qquad \text{(unconditional coverage)}$$

where N is the number of observations, x the number of exceptions and p the theoretical exception probability; the binomial coefficient (N choose x) appears in both likelihoods and cancels in the ratio

  - Limits of the test are: it doesn't account for serial dependence (meaning it doesn't assess a model's capability to avoid the time concentration of excess losses, hence it doesn't value the model's quality in reacting to changes in market conditions); it requires lots of data; and its power is weak (and decreases with the confidence level and the number of data points)

o The Christoffersen test: it checks whether the probability of an exception (or non-exception) occurring is independent of the previous day's outcome
  - The LR test is the ratio between the likelihood function assuming independence and the one allowing dependence on the previous outcome; it is possible to test it jointly with the Kupiec one by summing the two statistics, and the joint test is distributed according to a chi-square with two degrees of freedom (a sketch of both tests follows below)
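A minimal sketch of the two backtesting LR statistics on a 0/1 series of VaR exceptions, using the 0·log 0 = 0 convention; the exception series is simulated only to keep the example runnable:

```python
import numpy as np
from scipy.stats import chi2

def bern_ll(n0, n1, q):
    """Bernoulli log-likelihood of n0 'no-exception' and n1 'exception' days."""
    ll = 0.0
    if n1:
        ll += n1 * np.log(q)
    if n0:
        ll += n0 * np.log(1.0 - q)
    return ll

rng = np.random.default_rng(2)
p = 0.01                                    # theoretical exception probability
hits = (rng.random(1000) < p).astype(int)   # 1 = VaR exception on that day

N, x = len(hits), int(hits.sum())
pi = x / N

# Kupiec LR: theoretical coverage p against the observed frequency x/N
lr_uc = -2.0 * (bern_ll(N - x, x, p) - bern_ll(N - x, x, pi))

# Christoffersen LR: is the exception probability independent of yesterday?
n00 = int(np.sum((hits[:-1] == 0) & (hits[1:] == 0)))
n01 = int(np.sum((hits[:-1] == 0) & (hits[1:] == 1)))
n10 = int(np.sum((hits[:-1] == 1) & (hits[1:] == 0)))
n11 = int(np.sum((hits[:-1] == 1) & (hits[1:] == 1)))
pi01 = n01 / (n00 + n01)
pi11 = n11 / (n10 + n11) if (n10 + n11) else 0.0
lr_ind = -2.0 * (bern_ll(n00 + n10, n01 + n11, pi)
                 - bern_ll(n00, n01, pi01) - bern_ll(n10, n11, pi11))

print(f"Kupiec LR = {lr_uc:.2f}  (p-value {chi2.sf(lr_uc, df=1):.3f})")
print(f"joint LR  = {lr_uc + lr_ind:.2f}  (p-value {chi2.sf(lr_uc + lr_ind, df=2):.3f})")
```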

o The paper by Saita and Sironi provides an empirical analysis using 7 equally weighted portfolios (equity, global) to test a GARCH(1,1) model, a historical simulation (HS) and an exponentially weighted moving average (lambda = 0.94) at three confidence levels:
  - All three models failed to grant a consistent confidence level at 0.99 and 0.97, while at 0.95 the EWMA was more conservative and the other two still underestimated
  - The Kupiec test performed well (only at 0.95) in some cases: three for the EWMA, one for HS and two for GARCH
  - The extreme events showed a higher value than the theoretical one
  - The Christoffersen test failed in basically all cases; it performed better at 0.99
  - Internal models based on these approaches would have been more capital demanding; HS is the most conservative due to the abrupt changes
  - The analysis wasn't able to define a winner, and it was limited to equity with a daily horizon


Basel Committee Framework:

This agreement is the result of the meeting of the G10 central bank governors to set up a common banking standard, to ensure the solvency of the industry and avoid unfair competition based on different capital requirements set by national supervisory authorities. Hence it aims to make bank capital requirements uniform (it applies to consolidated accounts, hence it fosters the soundness of institutions controlled by foreign banking groups) and to prevent future bank crises (it reverts the banks' trend to reduce capital). It is an ongoing work trying to achieve a best practice ensuring global stability for all financial institutions.

Basel I:

Capital requirement computation is based on three different tiers of capital, which must sum to at least 8% of risk-weighted assets, i.e. the minimum requirement bound. The measure is net of goodwill and of investments in non-consolidated banks or financial institutions.
o Tier 1 is the most important and most valued part of the capital. It consists of an upper tier (paid-up shares, disclosed reserves and certain general provisions; in the EU the use of those items has been limited to specific risks only, and they must be after tax and immediately available, and their size, distribution and provisions must be shown separately) and a lower tier (innovative instruments which must be: un-redeemable, permanent, callable at the issuer's will only after 5 years, junior to all other instruments, able to absorb losses without a liquidation procedure, and with remuneration that can be deferred or, if impossible, forfeited)
o Tier 2 is the first category of supplementary capital; it can be used for at most 50% of total capital. It consists of:
  - Undisclosed reserves, meaning reserves created using revenues from off-balance-sheet items; they must however be as solid as the disclosed ones
  - Revaluation reserves (sizable in Germany and Japan): the difference between historical and current market value (they must come from a revaluation process), usable at 45% of their value (due to possible negative variations)
  - General provisions; they have however been limited by the new accounting principles and must account for at most 1.25% of total capital
  - Hybrid capital instruments: they don't require the bank to be put under liquidation to absorb losses, their remuneration can be waived or reduced, and they cannot be redeemed by the creditor (only with supervisory authorization). They must be unsecured (as all these instruments) and fully paid up
  - Subordinated term debt: the main difference with the previous one is that, to be liable for any loss, a liquidation procedure is needed. The conditions these instruments have to meet are: maturity longer than 5 years, "depreciation" at 20% each year, and juniority
o Tier 3 cannot account for more than 250% of the Tier 1 capital held against market risk and cannot exceed 50% of the total capital requirement. It includes short-term subordinated debt16 and can be used to cover market risk only.

Risk weights in the first Basel agreement are quite simple and "linear". They are:
o 0% for cash, government bonds of OECD countries and claims on central banks
o 20% for claims on central banks or countries outside the OECD, claims on banks with maturity below 1 year, and multilateral development banks
o 50% for loans secured by mortgages on residential property
o 100% for the others
o For the OTC items see pag. 555
A numerical sketch of the resulting requirement follows below.
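A minimal sketch of the Basel I computation, with illustrative exposures assigned to the weight buckets above:

```python
# (amount, risk weight) pairs for a toy banking book
exposures = [
    (1_000.0, 0.00),   # OECD government bonds
    (500.0, 0.20),     # interbank claims below 1 year
    (800.0, 0.50),     # residential mortgages
    (1_200.0, 1.00),   # corporate loans
]

rwa = sum(amount * weight for amount, weight in exposures)
min_capital = 0.08 * rwa   # total capital (Tier 1 + 2 + 3) must reach 8% of RWA

print(f"RWA = {rwa:,.0f}, minimum capital = {min_capital:,.0f}")
```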

The major limits are:
o Focus on credit risk only: there is no consideration of currency or other risks (solved in 1996)
o Poor differentiation of risk: the weights do not capture it, and the classes they define aggregate too much, allowing for regulatory arbitrage
o Limited recognition of the link between maturity and credit risk, as well as of risk mitigation instruments

Basel committee II

Pillar 1 (capital requirements) received all the Basel I criticism and tried to improve the overall capital system. It must be noted that the capital requirements are not fixed, but can vary following the supervisory expectations both on the capital and on the risk perceived by internal rating methods (by multiplying them by a scalar factor)
o Risk weights for the standardized approach are related to the rating agencies' assessments, with some improvements, acknowledging the […]

16 It must have a maturity of at least 2 years and cannot be redeemed; there is a lock-in if the capital ratio falls below the minimum plus 20%, and it must be reduced by loan loss forecasts and by securities held for trading

o The internal ratings approach is the possibility given to certain banks to use internal procedures to identify and estimate the main components of risk; there are two different versions:
  - Foundation: allows only the estimation of PD, while the other components (EAD, LGD and maturity) are set by the national authority
  - Advanced: gives the right to use own estimates for all the risk components. It can be used if these conditions are met: 7 rating classes, where the first must have a PD of 0.03%, and a default category must exist
Pillar 2 has been made to enforce the supervisory capability to control the banking system. This approach ensures a proactive prudential behavior based on these principles:
o Banks must set up a system of processes and techniques aimed at establishing the overall capital adequacy
o Supervisory authorities must check those processes, ensure the respect of the minimum capital requirement and promptly intervene to avoid capital deterioration
Pillar 3 has been designed to reduce opacity and to force the market to discipline unfair behavior and promptly penalize banks which take more risk. Banks are thus forced to publish information regarding: their economic and financial results, financial structure, risk management strategies, exposure to risks, accounting policies and corporate governance.


Insurance business

Solvency II

It will take 8 years to fully implement the new framework, started in 2005. There have been 5 quantitative impact studies (QIS), which have seen wide participation. The main features are:

This framework aims to enhance more consistent standards across the EU and to ensure that capital requirements are more reflective of the risks undertaken by insurers
o Market-consistent approach for valuing all assets, and explicit provisions against pro-cyclicality:
  - Assets and liabilities [non-insurance ones] should be valued at the price of an "arm's length transaction", with reference to IFRS and no recognition of changes in own credit standing for financial liabilities
  - Assets and liabilities [insurance ones] should be valued with a model (best estimate):
    - The price needs to be the same irrespective of the investment strategy
    - Use of the risk-free rate, adding a risk margin
The pricing model consists in computing technical provisions based on:
o Best estimate: the PV of future cash flows, computed considering prudent and realistic assumptions, and the possible future actions of management and consumers
o Prudential provisions are no longer allowed (they are forbidden); all guarantees, discretionary benefits and options must be valued

o Reinsurance recoverables must be valued separately and with allowance for credit risk
o The choice of the risk-free rate poses some problems [EIOPA provides some guidelines], such as:
  - Different currencies: solved using swap rates adjusted for credit risk, with QIS5 providing the spot rate curve for the main currencies
  - Different maturities need an appropriate rate:
    - Illiquid assets: duration capped at 15 years for EUR and 30 for GBP, linearly reduced to 0 in 5 years, with adjusted rates
    - The spot curve is calculated using a certain basket of corporate bonds
    - The liquidity premium is capped at the maximum available on the market for similar cash flows without risk, and it should account for the liabilities' nature (it is provided by an EU institution with the same frequency as interest rates)
    - The risk-free rates are extrapolated using a macroeconomic model to find the unconditional long-term rate

o Better recognition of diversification, risk mitigation and loss-absorbing items
o New supervisory approach, more proactive and EU-coordinated across countries and reflecting groups' economic reality:
  - Complementary group supervision, which has primary responsibility (internal models can be used); the methods are:
    - Accounting consolidation (to eliminate double counting)
    - Deduction and aggregation, with recognition of diversification
  - Third-country relationships are articulated in three aspects:
    - Reinsurance supervision from equivalent third countries: no difference in treatment
    - Group capital requirement from equivalent third countries: their requirement can be used
    - Group solvency from equivalent third countries: their authority is accepted
    - Transitional arrangements are put in place for important countries that are not equivalent to EU regulation

The reform aims to promote confidence and transparency in the insurance sector; it applies equally to reinsurers. It wants to overcome the Solvency I limits:
o Lack of harmonization
o Inconsistency with the new IFRS accounting principles
o Capital requirements that were not transparent and not adequate to the risks, and that focused on backward-looking aspects instead of governance issues (good risk management)
o No recognition of the economic reality of groups (only additional requirements, no reductions)


The formation process and directive structure:

The formation process and the authorities' involvement are basically EU-led, applying the Lamfalussy structure: Level 1 sees the involvement of the Parliament, Ecofin and the Commission helped by special committees, while Levels 2 and 3 see the "joint committee" [ESMA, EIOPA, EBA] check the law. The directive structure reflects Basel II/III's pillar approach.

Pillar I: risk calibration of the financial requirements, hence focused on the quantitative issues
o The SCR is calculated in terms of potential loss of value at a confidence level of 99.5% over a 1-year time horizon, considering all quantifiable risks:
  - It should (in an internal model) account even for the liquidity premium risk
  - Each risk is modeled individually, with a modular approach using factor or scenario methods
  - The individual risks are aggregated using a correlation matrix (provided by QIS5), as sketched below
  - The loss-absorbing capacity of technical provisions (meaning future benefits: changes in assumptions) and of deferred taxes must be taken into account:
    - The gross SCR is computed without considering benefits
    - The net SCR assumes bonuses reduce/absorb losses
    - The difference between the two is the FDB (future discretionary bonuses)
  - Mitigation techniques, collateral and segregated assets are allowed under certain conditions (legally binding, actual transfer of risk…)
  - Proportional recognition of techniques in force for less than 12 months is allowed under certain conditions (no rollover risk, and counterparty risk taken into account)
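A minimal sketch of the modular aggregation, SCR_total = sqrt(s' R s); the module charges and the correlation matrix are illustrative assumptions, not the QIS5 calibration:

```python
import numpy as np

scr = np.array([100.0, 30.0, 60.0, 40.0])   # stand-alone module charges (assumed)
R = np.array([                               # assumed module correlation matrix
    [1.00, 0.25, 0.25, 0.25],
    [0.25, 1.00, 0.25, 0.25],
    [0.25, 0.25, 1.00, 0.00],
    [0.25, 0.25, 0.00, 1.00],
])

scr_total = float(np.sqrt(scr @ R @ scr))
print(f"aggregated SCR: {scr_total:,.1f} (sum of modules: {scr.sum():,.1f}, "
      f"diversification benefit: {scr.sum() - scr_total:,.1f})")
```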

o The MCR (minimum capital requirement) is kept within a corridor of 25%-45% of the SCR:
  - Legal certainty, auditability and safety-net protection
  - It is based on percentages applied to combinations of premiums and technical provisions
o Solvency II allows choosing the level of simplicity or sensitivity used to assess risk, from the simplified method to the internal model:
  - The insurer must grant a sound framework for managing, controlling and measuring risk
  - The authority can still request specific parameters for internal models

o The Own Funds categorization:
  - Basic Own Funds: the excess of assets over liabilities, subordinated liabilities and adjustments (expected profits, net deferred tax and restricted reserves)
    - They can be used for all tiers of the capital requirement
  - Ancillary Own Funds (subject to prior supervisory approval): off-balance items that can be called up to absorb losses, such as unpaid shares, letters of credit and other legally binding documents
    - They can be used for Tiers 2 and 3
o The capital eligible to meet the SCR must be at least 50% Tier 1; hybrid instruments can be at most 20% of Tier 1, and Tier 3 cannot account for more than 15%
o The capital eligible to meet the MCR must be 80% Tier 1, with no Tier 3 and no ancillary funds available
o The criteria used to allocate OF into tiers are: subordination, loss absorbency, duration, freedom to redeem, absence of encumbrances and of mandatory servicing costs
Pillar II: the new supervisory relationship and the set-up of governance
o Authorities are empowered to take action to restore critical situations when the SCR gets closer to the MCR, so that policyholders' interests are preserved even in extreme situations:
  - There is a convergence of supervisory standards
  - Capital "add-ons" are authority demands to adjust the risk assessment
  - They can draft and implement measures, so that they can promote fairness, act against breaches of EU law and act in emergency situations
o To take into account cyclical effects the authority can extend the recovery period, allow an inherent rebalancing between SCR and available capital, the liquidity premium and the equity symmetric dampener
o There are qualitative risk management standards, so that risk management will play a central role in the company:
  - Internal control system, internal audit and actuarial functions

  - ORSA (own risk and solvency assessment): specific risk profile, compliance with the financial requirements, significance of the deviation between the two
o There are new disclosure requirements that bring market discipline to bear on insurers
Pillar III: opening up to market discipline provisions and reporting to the authority
o The disclosure to the market must be at least annual and cover business performance, risk profile, system of governance, capital management and valuation purposes
o The regular supervisory report must be narrative with quantitative data, every 3 years, but provision on an annual basis can be requested
o Annual and quarterly quantitative templates, with lags of 14 weeks and 5 weeks respectively

The main implementation problems and open issues are:
o The liquidity premium, the currency curves and how to allocate them to liabilities
o Too many uncertainties and different interpretations
o QIS5 has recognized 92% of OF as Tier 1
o Group diversification has allowed to free about 20% of OF on average, which has created a huge difference with non-EU entities
o Eliminating excessive complexity, reducing excessive volatility and reducing the penalization of long-term business with guarantees (revising the stresses for interest and spread risk at long durations)
o Improving the calculation criteria for contract boundaries and deferred tax assets and liabilities
o Finding political compromises on sensitive aspects: illiquidity premium, future profits, diversification in the risk margin and at group level
o Designing appropriate transitional provisions: disclosure and reporting, internal models
o Properly managing the new tasks of the supervisory authority (new tools, culture and powers)
o The impact of the new discipline will be huge: the overall business changes, with extensive impact on governance, strategy, the expectations of the authority and a new competitive scenario

Some examples:

An example of the usage of the aggregation of modules for SCR calculation purposes, based on the market risk case, which is divided into several categories:
o Market investments must have a minimum rating of BBB as an overall rule
o The equity risk module:
  - Speculative equities are treated depending on the market where they are listed and on the CEIOPS advice
  - Duration-based approach (22%) for certain lines of business
  - Symmetric adjustments are made by EIOPA on a benchmark of listed companies and are sufficiently public. The adjustment is capped between -10% and +10% and is computed with a formula that takes into consideration the current level of the index and the weighted average of daily data over a window of 36 months (a reconstruction follows below)
  - Strategic participations are stressed at 22%
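The notes describe the formula only qualitatively; as a reference, the published Solvency II calibration (an assumption here, not taken from these notes) computes the symmetric adjustment as

$$\mathrm{SA} \;=\; \frac{1}{2}\left(\frac{CI - AI}{AI} - 8\%\right), \qquad -10\% \le \mathrm{SA} \le +10\%$$

where CI is the current level of the equity index and AI its weighted average over the previous 36 months, consistent with the ±10% cap stated above.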

o The interest rate module:
  - All assets sensitive to interest rates should be stressed, as well as insurance liabilities in their PV change
  - There is no stress on the volatility of interest rates

o The spread risk module covers any change in the credit spread over the risk-free rate:
  - It is calculated on separate items (bonds, credit derivatives and ABS on loans)
  - Derivatives used for risk mitigation are excluded
  - Bonds and ABS are valued using a table based on the rating and a conversion factor (larger for the latter)
  - Instruments of EU states with AAA or AA rating are excluded, and the others bear a lower capital requirement, as do AAA-rated corporates

o The currency risk module:
  - It can be calculated on net or gross positions
  - It considers a maximum variation of 25%
  - No diversification benefits, but EU currencies are advantaged with lower stress levels

o The property risk module covers the risk related to real estate:
  - 25% for all direct property investments; investment in real estate companies is subject to the equity module
  - No geographical diversification

o The concentration risk module aims to capture the risk related to a lack of sufficient diversification or to the exposure to a large default risk (single issuer):
  - It applies to all categories of assets, considering only the exposures in excess of certain thresholds
  - It depends on the exposure and on the rating of the counterpart
  - Excluded are participations internal to the group, risks borne by policyholders, and exposures already considered in the counterparty default module

  - It doesn't consider geographical concentration
  - EU instruments with AAA and AA rating bear no capital requirement
o The liquidity risk module is new and allows for changes in the illiquidity premium:
  - Negative correlation with spread risk (-50%)
The counterparty default risk module considers the possibility of unexpected losses, or of a deterioration of the credit standing, not already included in the spread risk module:
o Type I: undiversified and rated exposures
  - Collateral reduces the risk (adjusting for risk); the LGD and PD are given
o Type II: diversified and unrated exposures (policyholder debtors, mortgage loans)
o The correlation granted between these two types is 75%

Difference between Banks and Insurers

The relationship between insurers and banks is getting closer and closer thanks to: new fashionable strategies to manage the reserves, taken from the asset management industry17; bigger cross-selling activity thanks to innovative products sold by banks and insurers (life segment), to the creation of new institutions providing insurance, or to the introduction of financial aspects into classic insurance policies18; and accounting principles that are getting closer (financial conglomerates directive).

However, the nature of the risks and liabilities of each type of financial institution is still different, even after a great convergence, due to specific aspects: demographic issues, scale of operations and structure of liabilities:
The main point is the difference between speculative and pure risk, where the latter is the one borne by insurers and is the most characteristic difference
The perspective of the clients is different: "sacrificing a small certain wealth to avoid the possibility of a big uncertain loss". The source of uncertainty in this industry is peculiar, because it is discontinuous in nature and the typical individual cannot hedge it on his own

o This idea comes from the functional perspective, where any institution exists to respond to specific needs; meeting those needs requires a specific function (less volatile), which in turn requires a structure to be performed, and competition will force this structure toward the most efficient one
o According to this theory, insurance facilitates the entrance of risk-averse investors into riskier markets and guarantees wealth in specific states of nature

The accounting principles still show some differences: the leading principle in insurance is the cost basis, not the price one; furthermore many scholars believe that a full harmonization would have adverse impacts on the insurance business
The securitization process has touched the insurance industry less, due to the specific know-how required
The supervisory authorities are different, even if the new framework, with the creation of a unique joint committee, aims to reduce/eliminate this difference of treatment
The distribution channels are different and need different expertise; even if a unique channel could be very efficient, the two distribution forces are still too different (especially for non-life instruments) and may require training costs bigger than the possible synergies in some fields
The liabilities have structural differences due to the underlying risk (a pure one) and to the uncertainty both in timing and in future exposure

Furthermore, the theoretical tools used by insurers for managing and transforming their own liabilities are different, since it is not possible to replicate the cash flows of the insurer's liabilities or to exploit the diversification theorem to reduce risk.
The tools used are based on statistical rules relying on risk pooling (law of large numbers and central limit theorem) [the accidental losses to which the group is subjected become predictable within limits] to reduce the underwriting risk (the risk that the actual payoff differs from the expected one; it is a speculative risk), divided into:
o Cash-flow risk: related to any error in the estimation of probabilities or losses
o Timing risk: related to errors in estimating the actual timing
The goal of the insurer is to build a good liabilities "portfolio", not a good asset one; hence the core risk is the underwriting risk, not the mismatch one (as the crisis has shown, the key problem in banks is related to valuation)
The hypothesis behind this solution is that, by collecting a big sample, we ensure normality and a convergence of the first and second moments between sample and population (a sketch of the pooling effect follows below)
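A minimal risk-pooling sketch: with independent, homogeneous policies the standard deviation of the average loss shrinks like 1/sqrt(n), making aggregate losses predictable within limits; the per-policy loss distribution is an assumption:

```python
import numpy as np

rng = np.random.default_rng(3)
mean_loss, sd_loss = 100.0, 300.0    # assumed per-policy loss mean and std dev

for n in (10, 100, 1_000):
    # simulate 2,000 portfolios of n policies each and look at the average loss
    losses = rng.normal(mean_loss, sd_loss, size=(2_000, n))
    avg = losses.mean(axis=1)
    print(f"n = {n:5d}  std of average loss: {avg.std():7.2f} "
          f"(theory: {sd_loss / np.sqrt(n):7.2f})")
```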

17 Actively managing the reserves, increased diffusion of unit- and index-linked instruments; usually this activity is outsourced
18 The insurer can use the larger bank customer base to increase its own business

o However, the sample may not have a unique (homogeneous) payoff diagram, the loss distribution may not be known, and there may be correlation among the risks (reducing the benefits of pool sampling)
The presence of underwriting risk (different from the financial one) is crucial and rests on two additional risk factors: the insurer's dimension and the timing of the future outflows
o The bigger the insurer, the lower the risk, especially in the presence of low correlation among classes of risk (the opposite of financial risk)
o The timing problem is reduced by using bigger samples to shrink the window; but since the reduction isn't perfect, the insurer cannot fully hedge its exposure (investment risk), and all its calculations are computed over expectations
The insurer can use risk spreading (insuring different groups) or reinsurance agreements to reduce the risk exposure
There is no secondary market, due to structural characteristics (the holder cannot sell his policy and will receive the payment only from the issuer); furthermore liabilities and assets are not perfectly matched, hence the insurer cannot sell at any time an asset together with the related liability
o This problem is related to the lower rate of securitization in this industry (lack of know-how); however some risks can be pooled together and sold in the market, providing instruments with low correlation with the market
o As a consequence of less securitization, the insurance industry hasn't developed an originate-to-distribute business; on the contrary, insurers have relied less on external capital for their daily activities
The usage of hedging instruments can destroy value, since it is impossible to define an a-priori hedging strategy; it is better to manage the insurer's valuation process by taking specific assumptions about the future dynamics
The VaR approach performs poorly in insurance due to the lack of a linear relationship between risk and maturity (the rollover and the expected duration pose severe problems in assessing risk)

Some definitions regarding the premium and its calculation:
o Fair premium: the one that covers the underwriting risk, the opportunity cost of the allocated capital and the administrative costs. It is discounted back, since the payment is made in advance. There is a trade-off between the need to be competitive (to collect enough data) and the need to compensate the capital for UL (a sketch follows below)
o Pure premium: the one that covers the expected loss
o Expected claim cost: the weighted average of the future payoffs [remember the limitations compared with a utility function]
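A minimal fair-premium sketch following the definitions above: expected loss plus administrative costs plus the opportunity cost of the allocated capital, discounted back because the premium is paid in advance; all figures are illustrative:

```python
expected_loss = 80.0        # the pure premium component
admin_costs = 10.0
capital_allocated = 50.0    # capital backing the unexpected loss (UL)
cost_of_capital = 0.06      # opportunity cost of that capital
risk_free = 0.02            # one-year discount rate; the premium is paid upfront

fair_premium = (expected_loss + admin_costs
                + cost_of_capital * capital_allocated) / (1.0 + risk_free)
print(f"pure premium: {expected_loss:.2f}, fair premium: {fair_premium:.2f}")
```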

A new risk paradigm

Problems related to the crisis:

Risk transparency inside the corporations, between Risk Management (RM) and the board, was insufficient:
o Weaknesses in basic risk infrastructure: significant data availability and quality challenges, plus an overreliance on complex mathematical models rather than insight into potential future risks
o Non-exhaustive quantification of risks and a silos view of risk types; especially in the board room there was limited understanding of the liquidity, capital and accounting implications
Risk ownership issues consist in the incapability of some financial institutions to understand and properly manage their risks. This inappropriateness is caused by:
o Major flaws in strategy: decisions to move into new businesses not based on a profound understanding of the potential risks inherent to those businesses
o (Lack of) own skills in the organization not considered or overestimated, leading to a false assessment of the risk appetite/risk-taking capacity
o First and foremost: lack of top management understanding in the board
o Market bias: "if my peers engage in this it cannot be wrong", anxious not to miss the opportunity
Governance and structure limitations have blocked the institutions in their reaction to the crisis:
o Insufficient information flow: parts of the organization did expect severe repercussions but did not inform the board in time
o No clear risk responsibilities for key topics around capital (too much socialization of the responsibility); liquidity falling between the CRO and the CFO, with limited integration with the BUs
o Businesses not ready to act: since the accountability for risk evaporated, the decision process was too slow
o Skill deficiencies across the entire organization, most prominently including the board
Culture and incentives have facilitated adverse actions:

o Misaligned incentives at all levels, from shareholders to the individual desk, incentivizing risk taking in general and market conformity in particular, rather than mitigation
o A culture of fear to "take personal risk" and expose oneself through bad messages and/or actions
o No consistent and explicit risk culture: the lines of defense, from front-line mitigation to capital, not laid out, established or working
o In particular, a lack of appetite, early in time, to take bold managerial action and divert from the mean
Regulations have been treated as "adversaries" rather than as facilitators of stable markets:
o Compliance behavior focused on the fulfillment of requirements (e.g., Basel II) rather than on proper risk management overall
o Regulators' skills kept at a distance, "trained to the extent deemed helpful" but not viewed as a partner in stability for effective markets
o Systematic arbitrage of regulation, especially the optimization of regulatory capital
o Reactive approach to regulation: shaping both policies and skill building with regulators for the short-term benefit of the bank

All these problems have caused:

A loss of confidence in RM, due to the lack of understanding of discontinuities and the lack of a general idea of which model should be preferred (which balance between bold and conservative?)
A new business environment, where the authorities will reshape all we do and there will be increasing pressure to bring risk management to best-practice levels
A new RM approach is coming out, more focused on efficiency in communication and in assessing risk: there will be a new risk paradigm

What the new RM activity looks like after the crisis:

o Risk transparency and insight: from models to insight and foresight, and from daily VAR to structural risks
  - Greater levels of risk transparency will not come from more complex modeling, but from understanding the positions and the portfolio, taking a forward-looking approach to risk assessment and its accounting implications, and exercising management judgment on market behaviors. Structural risks will constitute a completely new risk category
  - Risk identification & understanding: build insight into all relevant risks to form a comprehensive view, ensure an adequate aggregation using a properly estimated correlation matrix, and understand the accounting and P&L consequences
  - Risk foresight: develop early-warning KPIs for structural risks and bubbles
  - Risk modeling: test the resilience of the business against specific scenarios
  - Risk tracking: build a new MIS focused on decision support
  - Risk infrastructure: ensure adequate robustness

o Risk ownership: the management needs to build an explicit strategy accounting for which risks are undertaken, how to mitigate their effects and what the institution's risk appetite is
  - Natural ownership: it is crucial to undertake only risks that are understood and to participate only in markets where there is a competitive advantage in managing risk
    - Comprehensive analysis at business unit level across all types of relevant risks
    - Gap analysis against the business's inherent risks
    - Regular benchmarking against competitors
  - Lines of defense: it is crucial to have an in-house process to assess risk and to take immediate action in stressed situations, considering:
    - The resilience of the business: diversification and flexibility
    - The own capability to assess risk: front-line skills are central
    - The financial strength to withstand losses: forward looking and peer comparison
  - Risk capability and appetite: define specific action programs to increase robustness by implementing no-regret moves, to become a risk leader in selected areas, and to improve resilience by creating extraordinary flexibility to benefit from risk events
    - Selection of the risks that are generally acceptable against the target business model and the actual/potential risk ownership advantages
    - Dynamic setting of the acceptable risk exposure and of the necessary mitigation depending on the market environment, to avoid unwanted risks and to exploit opportunities
    - Definition of actions to shape the risk-return profile for optimal upside and reduced downside impact in different market situations

o Governance and structure must be shaped to mirror the new paradigm, creating formal mechanisms to foster debate about the evolution of the market scenarios and to fully leverage the information within the organization
  - At the board level: there must be an active (and well-defined) role in risk management, properly checking that the overall risk stays within the target
  - At the managerial level: the decision-making process must be less deterministic and look forward with a scenario approach. It should aim to optimize the risk-taking process toward a higher transparency in the identification of risks, with a clear strategic view on the evolution to properly plan/budget the business
  - At the level of daily activity: it should be less cyclically dependent, hence more forward looking (more macro oriented), and properly connected with the overall RM unit
  - The key elements of the structure (given the governance scope):
    - Transactions Risk Management Organization: right decisions (consistent with the plan)
    - Risk Operations Organization: rules must be put in place to manage operations
    - Risk Monitoring and Modeling: set limits and use appropriate models
    - Strategic and Managerial Risk Organization: all levels must be aware and properly set up

o Culture and incentives must be adequate to the new world of uncertainty, capital shortage, higher and persistent volatility, regulatory change and sovereign risk. This is the area that needs the most investment, to ensure alignment between all levels and the institution/policyholders
  - Risk culture assessment: define a common language and properly assess the persistence of differences
  - Risk culture enhancement programs to strengthen the cultural environment
  - Alignment of core systems with the incentive scheme, communication between BUs and carrier incentives

o The regulators' reaction was to provide guidelines regarding three main blocks:
  - Structure
    - Banks "should have a risk management function […] a compliance and an internal audit function, each with sufficient authority, stature, independence, resource and access to the board"
    - Banks should have an independent CRO, "at least in large banks distinct from other executive functions and BU responsibilities"
    - The CRO should "have direct access to the board and its risk committee without impediment"
    - The CRO should not have any responsibility in respect of any operational business line
    - The risk management function should:
      o cover all risks of the bank
      o "be sufficiently independent of the BUs whose activities and exposure it reviews"
  - Management processes
    - "Risks should be identified, assessed and monitored on an ongoing firm-wide and individual entity basis"
    - "By properly positioning and supporting its risk management function, a bank helps ensure that the views of risk managers will be an important part of the business decisions"
    - The risk function should be involved in:
      o the approval process for new products
      o assessing risks that could arise from M&A
    - Effective risk management requires robust internal communication about risk, both horizontally across the organization and vertically up the management chain
  - Governance
    - "The board should be supported by competent, robust and independent risk and control functions, for which the board provides effective oversight"
    - It is appropriate for large and international banks "to have a board-level risk committee or equivalent, responsible for advising the board on the bank's overall risk tolerance/appetite and strategy and for overseeing senior management's implementation of that strategy"
    - All compensation should be risk adjusted
