Banking Regulation
Lesson 8/05/2020
Principles of Banking Supervision (1)
What are the basic expectations that we have toward banks in terms of risk management and modelling practices? The focus will be on credit risk. The purpose is to give two different perspectives on the same issue: on one side, the Central Bank supervisor can present its expectations toward the banks' credit risk modelling; KPMG can give the perspective of people actually supporting credit institutions in improving their risk management processes and developing credit models.
(Slide 1) Why are banking supervision and banking regulation so important? There are, basically, financial markets, and their role is to channel resources from the people who save money to the people who invest it and put it into production. This process is subject to asymmetric information issues: both moral hazard and information asymmetries are at work, because savers are not able to screen the behavior of the entrepreneur, and they are in a worse informational position than the entrepreneur when it comes to assessing the quality of the investment activities.
So, banks are at the heart of this mechanism, sitting between savers and investors: they collect deposits, screen investment projects and monitor borrower behavior, preventing the money that gets lent from being used in the wrong way. In doing this they generate a benefit for the financial market, because they mitigate asymmetric information; for a number of reasons, banks are better placed than individuals to perform this institutional role.
So far we have basically discussed the credit risk perspective, but it is not the only risk that banks face. Another standard risk is liquidity risk, and we have already explained why it affects the structure of any banking business: we have deposits, and we know that we can always go to the bank and ask to withdraw them, while loans tend to be longer term; but in order to finance the economy you need them both. In particular, loans are not very liquid: you cannot monetize them easily or quickly. In this respect, this kind of maturity transformation is really a structural feature of any banking business. In most cases, banks and the most sophisticated institutions also tend to increase the leverage implicit in their business, and this adds a further component: instead of relying only on deposits, they also collect money from institutional investors on the financial market and use it to fund their business. What is the main safeguard against liquidity risk? In the short term the main safeguard is given by cash and HQLA, an acronym that stands for high quality liquid assets. The idea is that you need a liquidity reserve (money, or something that you can liquidate very easily, monetize overnight) in order to absorb possible shocks on the liability side: you may have deposit outflows, losses in general, or some additional costs, which represent all sources of liquidity risk, and in order to protect yourself you need liquidity resources. So, to say a few things about safeguards for the banking business: cash and HQLA play the same role against liquidity risk that equity plays against credit risk. With equity I can cover losses from the non-performing loans in my portfolio; with cash and HQLA I can manage a compression of my balance sheet due to liquidity outflows, that is, a sudden reduction of my liabilities that forces me to monetize part of my assets.
One important point to come back to about credit risk: if there is a downturn in the economy, your NPL portfolio starts to grow, and a significant fraction of your resources ends up invested in those loans; in this scheme it is as if the equity were basically financing, almost entirely, the NPLs, which are about the size of your equity. This is perceived as a real issue by banks, because part of their resources is diverted from the real purpose of financing the real economy: they are funding positions that are in recovery and not productive, and in particular those resources cannot be considered income producing. So once you have crises like the one of the past few years for the Italian banking system, this is a problem, because you are using your resources in a biased way: not to finance the real economy, but to finance expected future recoveries on positions that are already defaulted. This problem was so big that the European institutions and the regulators put in place dedicated actions to give banks incentives to reduce the stock of their non-performing loans, so as to better finance the real economy. Liquidity risk is especially important nowadays, in this Covid-19 scenario: first, the real economy is frozen, so loan repayments are subject to moratoria, i.e. delayed payments, which means a delay on the asset side while the liabilities keep running; second, all the companies that have committed lines are trying to draw on them in order to get extra liquidity; third, there is a number of new financings granted against new guarantees, such as public guarantees, which will bring an increase of the asset side as well. So there are three driving forces weighing on the liquidity equilibrium of any institution, and that is why we are paying significant attention to this profile.
There are also different angles from which you can look at liquidity risk, with two main components: market liquidity risk and funding liquidity risk. Funding liquidity risk is a shock on your liability side: maybe depositors withdraw their deposits, or the funding you get for your activity is reduced. On the asset side, you may run into problems with the quality of your assets, so that you are not able to monetize them, or you may have a shock on HQLA. To name a typical HQLA: Italian sovereign bonds. When the spread on Italian sovereign bonds increases, the price of those instruments falls and liquidity in that market narrows, so the ability to sell those products and get funding is lower, and this is an additional risk.
In order to have a liquidity crisis you need both phenomena at the same time. Think about having long-term assets and short-term liabilities: if you have a funding problem, a shock for example, as long as your assets are liquid and can be monetized in the market you can still pay back those liabilities; if you cannot monetize them, then of course you run into a problem. So there are two sides of the issue, but you need both elements to have real pressure. That is what happened during March: a sudden reduction in prices in the financial markets, difficulties in monetizing assets, and an increase in liquidity demand due to the freezing of the economy; the ECB played a central role, it tried to offset the pressure on the market, and it was successful. On liquidity: what we saw before is basically the traditional bank, but actual banks are much more sophisticated, and in general there is also a bit of a shadow banking component: they use vehicles to perform maturity transformation, and in general there are businesses that try to transform vehicles' commercial paper into structural funding, which is a form of maturity transformation. In the standard traditional bank environment a deposit outflow may affect your credit lines; in more complex institutions the possible sources of liquidity risk are multiplied: you may have problems in the short term, and you may have problems in the long term from not being able to issue bonds. This is what happened to money market funds: they fell because their investors wanted liquidity, and if the instruments held are no longer liquid because the market is frozen, then this conversion is not possible. One thing to know is that in general liquidity reserves are not held by banks as cash: they are made of HQLA. I hold these as a liquidity reserve and I use the market to liquidate them, in the repo market, in the safest possible way; and if it is very safe, it is perceived as money. Cash alone cannot cover all payments: for banks it has to stay in the network, because it has to be handed out on a client's demand, otherwise you end up in a bank run. So, repo is the standard way in which banks manage liquidity risk: in the short term they tend to hold their liquidity in securities, and those securities can be posted as collateral in order to meet liquidity demand, while the banks that lend money in the reverse repo market receive securities that can be passed on to another client. A liquid repo market is a good institution for banks, but during crises it can suddenly become illiquid: in particular in 2008 you could sell assets only with a haircut, which tells you what percentage of the value of the asset you can really monetize; in 2007 the haircut was essentially zero, while in 2008 you could get back only a fraction of your asset's value.
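To make the haircut mechanics concrete, here is a minimal sketch in Python, with purely illustrative numbers, of how the haircut limits the cash a bank can raise against a security in the repo market:

```python
# Minimal sketch (illustrative numbers): how the repo haircut limits the
# cash a bank can raise by pledging a security as collateral.

def repo_cash_raised(market_value: float, haircut: float) -> float:
    """Cash borrowed against a security in a repo; the lender
    withholds a fraction of the market value as a haircut."""
    return market_value * (1.0 - haircut)

# Benign markets (e.g. 2007): haircut near zero, the bond is "as good as money".
print(repo_cash_raised(100.0, 0.00))  # 100.0
# Crisis markets (e.g. 2008): the haircut jumps, so the same bond
# monetizes only a fraction of its value.
print(repo_cash_raised(100.0, 0.45))  # 55.0
```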
Capital and earnings are both very important. Consider the graph 'NII/Avg Assets Evolution': it shows the projection of the interest income that banks can earn in the future over a certain period. In the initial part we see the sharp reduction of interest rates, and then, for all the banks, a reduction in the earnings associated with their assets; but there is also a lot of dispersion, with some banks more and some less affected. There are probably several elements behind this, such as different strategies regarding interest rates; it is a kind of survival question in such a scenario, and a key element for profitability.
Another important graph is 'Sight and Savings Deposits'. Sight deposits are really the franchise of any bank: they are a huge part of any bank's liabilities, and they yield a margin, the difference between the market rate and the client rate. Before 2008 that distance was very big: you were able to borrow at around 1% and lend at around 5%, so there was a lot of gap. What happened?
After 2008 there was a sharp decline in rates, so the margin sharply declined: it was compressed by the market crisis, and this was not related to credit risk, it is as if there were no credit risk associated with it. This compression hit the franchise implicit in the banking industry, and it is probably the first source of difficulty for commercial banks nowadays: they are no longer really able to monetize this franchise. It is also the reason why, back in 2008, others wanted to buy banks here in Italy. So negative rates are one of the main issues for banks, together with prolonged low interest rates, which may be even harder to withstand.
Operational risks:
It is the risk of losses that may result from internal mistakes, inadequate operations and other failures: mistakes in general that affect your balance sheet. Cyber and IT risks are very relevant problems, because there is a huge process of digitalization going on, so new risks are emerging from this point of view, and investing in this area may be very important. Environment: today people are much more focused on the green economy and on trying to reduce pollution. During this Covid-19 period the main discussion is all about operational risk, and I want to provide a few examples. Think about Banca d'Italia: it is working about 90% remotely, so you have to do most of the work from home, and an important fraction of employees had never been tested in such a setting; in a remote setting you need to be extra careful, because banks cannot make mistakes, and they have to change the way they interact with clients, since you cannot meet them in person. We are in a context far away from the routine, so you need to pay more attention.
For trading activities you need to do the mark-to-market, and a lot of controls (on communications, social media, etc.) do not work as easily in this setting. A lot of operational risks are also related to data: if you lose data, you can also have problems with the clients, and so on.
So far we have discussed the standard risks of standard commercial banks, but banks are not all alike: universal banks provide a wider range of services, and so they differ in this respect. Banks act as market makers and access the primary market: if you want to go public you deal with the banks that handle that, and there are also banks that deal with the private side. This brings a wider range of risks, and in particular more risks for the banks that are more concentrated and larger.
Universal banks:
Consider the standard balance sheet of a universal bank and its standard operating income: you have trading activities, committed and uncommitted lines, and so on. The operating income is affected by every source of risk; the operating profit is the item most affected by risk. We ask for minimum equity, but what does that mean?
We ask for the minimum amount of capital that any institution has to hold. You should remember one number: 12.5. Banking is a kind of leveraged business, run on a small fraction of equity: you take money by borrowing and then invest it, and the maximum ratio between assets and equity is 12.5. So for each euro invested in equity I can have 12.5 euros of assets; putting it in other terms, for each euro of assets I must have 8 cents of equity, so 8% (1/12.5) is the key number for the minimum capital requirement of a bank. Of course, it is not calculated on the nominal amount of the exposures: a customer loan, a mortgage and an exposure to a hedge fund each get a different weight according to their typology. This is basically the standard rule for capital requirements: I always ask that capital, compared with the risks, is above the regulatory minimum of 8%. I do not consider all the risks, but just a subset of them: credit risk, market risk, operational risk and concentration risk; these are the risks that can generate problems, and the capital held must cover the requirements computed against them.
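To make the 8% / 12.5x arithmetic concrete, here is a minimal sketch; the exposures and risk weights are illustrative (they echo typical standardized values, not any specific bank's book):

```python
# Minimal sketch of the 8% / 12.5x arithmetic described above.

def min_capital(rwa: float, ratio: float = 0.08) -> float:
    """Minimum equity against risk-weighted assets: 8% of RWA,
    i.e. RWA can be at most 12.5 times the capital."""
    return rwa * ratio

exposures = [
    (1_000.0, 0.35),  # e.g. a residential mortgage
    (500.0, 1.00),    # e.g. an unrated corporate loan
    (200.0, 1.50),    # e.g. a high-risk exposure
]
rwa = sum(amount * weight for amount, weight in exposures)
print(rwa)               # 1150.0
print(min_capital(rwa))  # 92.0 = 8% of RWA (leverage cap of 12.5)
```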
Basically, it is the same story: we have the distinction between the trading book (items held with trading intent) and the banking book; the trading book is subject to market risk requirements, while the banking book is subject to credit risk requirements. The purpose is to have enough equity capital, sufficiently above the minimum, to offset the risks that arise from these elements.
So, another important element: how are the risk weights calculated? They can be calculated in two different ways: with a standardized method, or using internal models that leave to the bank the estimation of the internal parameters, and this applies to market and operational risks as well. Smaller institutions can use the standardized method, while the more sophisticated ones focus on internal models. The minimum capital requirement is the sum of three VaRs: VaR stands for value at risk, and it tries to measure the maximum loss that a bank can suffer over a certain amount of time.
So the standard idea is that the minimum capital, the amount of equity, must be equal to the maximum loss that I can suffer from all my activities: it has to be sufficient to cover all possible losses with 99.9% confidence, i.e. only a 0.1% chance of default over a one-year horizon. This measure is itself an estimate, and a rather rough one, since it depends on the assumptions we defined before: I assume that I suffer the maximum loss from credit risk, market risk and operational risk all together, simultaneously. The confidence interval is very demanding, but there are also things that the model does not capture, or mistakes in implementing it, so in practice defaults tend to be more frequent than predicted. The definition of value at risk is that capital has to be sufficient to cover any loss I expect to have within that confidence interval: this is the main regulatory element that is set.
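As a toy illustration of value at risk as a loss quantile (simulated data; losses are drawn as normal purely for simplicity, while real credit loss distributions are fat-tailed, as discussed below):

```python
# Toy illustration of VaR as a loss quantile. Normality is assumed only
# to keep the sketch short; it is exactly the assumption the tail-risk
# criticism below attacks.
import numpy as np

rng = np.random.default_rng(0)
losses = rng.normal(loc=10.0, scale=30.0, size=100_000)  # simulated yearly losses

confidence = 0.999
var_999 = np.quantile(losses, confidence)  # loss exceeded only 0.1% of the time
print(round(var_999, 1))  # capital should at least cover this amount
```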
So basically, we just said that capital requirements rely on value at risk, and of course value at risk has its critics; there are four problem areas: tail risk, procyclicality, complexity, arbitrageability.
1. Tail risk: standard value at risk is a quantile estimate, and a quantile tells you nothing about the shape of the distribution beyond it; the tail can behave well or badly, and if things go wrong the loss can be much bigger than the one I have estimated in my capital. We already know that credit losses are not normally distributed, so the tails are fat. A massive concentration may change the picture: an apparently safer instrument can have a lower probability of loss but a massive loss sitting in the tail. And since with internal models it is the bank that defines the distribution, it is effectively the bank that decides what the tail looks like.
2. Procyclicality: the problem with risk-based capital requirements is that they are a source of procyclicality. In general, expansions of banks' assets have been accompanied by increases in total liabilities, so in a boom banks tend to run higher leverage, with capital growing less than proportionally; and since capital tracks measured risk, when things go wrong the situation is much more problematic. After several years of good conditions and a growing path, with a large build-up of assets and liabilities, a crisis becomes almost impossible to withstand.
3. Complexity: basically, if you think about a balance sheet, minimum capital requirements are a kind of risk weighting of the assets as a function of their implicit risk. A more basic measure is leverage: assets as a multiple of equity. By reducing the measured implicit risk you can even double your leverage, while the plain leverage ratio cannot be gamed in the same way. In Haldane's paper, the risk-based capital ratio was not a predictor of default, and even banks with quite high capital ratios failed. This is a kind of standard finding; the reason why we nevertheless keep risk-based capital ratios is to avoid giving banks an incentive to shift into riskier positions.
There are also a number of advantages: the very fact that we discuss with the banks the main ingredients of credit risk gives us an additional perspective on the bank in general and a better understanding of its portfolio, and it allows better decisions to be taken.
Calculating Capital Ratios
As regards capital, there are different typologies of capital, defined as a function of the loss-absorbing capacity of the instruments, starting from the definition of the bank's common equity. In general, there are also additional instruments issued by banks, such as hybrid bonds and subordinated capital. There are different typologies of ratios: we calculate the Common Equity Tier 1 ratio as common equity over total RWA, but we also have the Tier 1 ratio and the Total capital ratio. There are a number of items that must be deducted from the common equity calculation, and the computation is very technical.
The 3 main blocks are: credit, market and operational risk.
Credit risk requirements apply to banking book items: loans and receivables, guarantees and so on are subject to capital requirements, and there are 2 basic methodologies, standardized and internal models; with internal models the banks have to estimate PD, LGD and EAD, and the outcome is the risk weight calculation. With the standardized approach, accounting values are multiplied by regulatory credit risk weights that differ across exposure classes, so the balance-sheet exposure is transformed by these risk-weight multipliers. Defaulted exposures are treated separately: if specific provisions are below 20% of the exposure, the risk weight goes to 150%. With the internal ratings-based approach, the regulator provides a formula that must be applied; both the standardized approach and the IRB formula try to define the maximum loss that you can have at the 99.9% confidence level. Comparing the two graphically, the IRB approach is risk-sensitive while the standardized one is much flatter, so there is an implicit premium: for good credit quality you tend to get a better treatment under IRB, while for very poor quality you get a higher capital requirement under IRB, where the standardized weight stays flat.
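For reference, here is a hedged sketch of the core of the Basel IRB supervisory formula for corporate exposures (maturity adjustment omitted for brevity; the correlation function is the standard supervisory one). It shows why IRB capital is so much more sensitive to credit quality than the flat standardized weights:

```python
# Hedged sketch of the core of the Basel IRB supervisory formula for
# corporate exposures (maturity adjustment omitted).
from math import exp, sqrt
from scipy.stats import norm

def irb_capital_requirement(pd_: float, lgd: float) -> float:
    # Supervisory asset correlation, decreasing in PD
    w = (1 - exp(-50 * pd_)) / (1 - exp(-50))
    r = 0.12 * w + 0.24 * (1 - w)
    # PD conditional on the 99.9% adverse systematic scenario,
    # minus the unconditional PD (expected loss is covered by provisions)
    stressed_pd = norm.cdf(
        (norm.ppf(pd_) + sqrt(r) * norm.ppf(0.999)) / sqrt(1 - r))
    return lgd * (stressed_pd - pd_)

# Capital per unit of exposure grows steeply as PD deteriorates:
for pd_ in (0.001, 0.01, 0.05):
    print(pd_, round(irb_capital_requirement(pd_, lgd=0.45), 4))
```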
Lesson 14/05
Risk is related to variability in the value of financial instruments. Every time you
talk about risk (in particular about credit risk) you are going to have two different
concepts: 1) expected component of risk; 2) unexpected component of risk. What
do financial institutions do when they manage risk? They try to absorb, mitigate,
diversify, and transfer risk. The idea is that risk generates performance volatility
that has a negative impact on the value of the financial institution. For this reason,
we try to measure this risk in order to monitor and mitigate it.
We just said that we have two different components of risk. The idea is that we
have some fluctuations in market value (ex. Share price going high and below a
certain threshold) that we can split into an expected component and into an
unexpected component. We measure the two different components in two
different ways. The same reasoning can be made when thinking about the number
of loan defaults and credit rating downgrades as well as the number of processing
errors: both categories can be divided into two components, an expected
component and an unexpected one which can be computed with different
methods.
When we go through the computation of economic capital, we can divide the loss distribution into 3 parts: 1) expected loss: the bank knows that it is going to lose some amount because some borrowers will default and some credit obligations will not be met, and for this reason it charges a standard risk cost to cover such losses; 2) unexpected loss: losses that the bank does not expect; the economic capital is computed taking into account the losses in this part of the distribution; 3) catastrophic losses, for which there is no capital coverage in the bank's estimation of capital. Expected loss is important for pricing, because pricing needs to cover the expected loss plus the risk premium on capital. When we refer to pricing, we mean that when a borrower pays for a mortgage, he pays an interest rate; that rate is based on an estimate of the riskiness of his mortgage, and part of the interest goes to cover the expected component of the risk.
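A minimal sketch of the expected-loss component that pricing must cover, with hypothetical mortgage figures:

```python
# Minimal sketch: the standard risk cost priced into the rate.

def expected_loss(pd_: float, lgd: float, ead: float) -> float:
    """EL = PD x LGD x EAD."""
    return pd_ * lgd * ead

# Hypothetical figures: 1% one-year PD, 25% loss if the mortgage
# defaults, 200,000 outstanding.
print(expected_loss(pd_=0.01, lgd=0.25, ead=200_000))  # 500.0 per year
```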
The Basel Committee was first established at the end of 1974, after a shock in the banking market and in the currency market which led to the failure of a number of banks in West Germany. The Committee aims to enhance the stability of the banking system by issuing regulatory requirements, reflected in guidelines and standards that banks must apply in order to evaluate risk. Before the Basel accords, banks were independent in their risk management; afterwards, they switched from a structural approach to a prudential approach with Basel 1, which introduced capital requirements and a standardized approach. With Basel 2 we also have the inclusion of internal models, so that banks can compute regulatory capital using internal models.
1988: first Basel accord, prompted by the Latin American debt crisis; introduction of the solvency ratio; internal models are not allowed.
2004: with Basel 2 the Committee decided to replace the first accord, introducing the three pillars, internal ratings for the calculation of capital requirements for credit risk (IRBA), and the recognition of new techniques in the market.
2010: Basel 3, the response to the financial crisis of 2007-2009, during which banks had too much leverage; the Committee decided to raise capital and liquidity requirements. The Basel 3 reforms were finalized in 2017 (so-called Basel 4), and in this context the Committee decided to reduce the variability in the calculation of RWAs.
Regulators ask banks to set capital resources according to their own degree of riskiness. There are two main reasons for such constraints: 1) in case the bank faces higher losses, it can rely on these precautionary resources; 2) banks may not find it convenient to take on too much risk, since they would have to set aside more capital.
Capital requirements represent the first pillar of Basel 2 that banks should satisfy. We are referring to several risk sources: credit risk, market risk, operational risk and counterparty risk. These requirements have pushed banks to improve their ability to measure risk.
IRB Advanced: internal rating based advanced framework.
Regarding the second pillar, it is about the supervisory review, and there are two main points of attention: 1) the banks' internal capital assessment; 2) efficient regulatory supervision. To comply with the second pillar, banks are required to undertake an internal capital adequacy assessment process, which consists in designing and implementing a risk-adjusted framework. This framework must ensure that the bank constantly meets the capital requirements and manages the risks beyond those already captured in Pillar 1. This process, the ICAAP, needs to be approved by the board of the bank before being submitted to the regulator for review. Pillar 3 is about market discipline: it lists a set of disclosure requirements which allow market participants to rely on more information about the capital adequacy of institutions.
We should not think of the pillars as stand-alone elements: they are three integrated components which work together.
Credit risk: possibility of losses associated with decline in the credit quality of
borrowers. There are three methods to compute it: standard, FIRB, AIRB.
Market risk: it is related to the fluctuation of the price of financial instruments.
Counterparty risk: it belongs to credit risk since it is the risk that the counterparty
will not live up to its contractual obligations.
Operational risk: risk of losses resulting from inadequate or failed internal
processes, people and systems. We can think of operational risk as, for example, human error.
We have two different approaches: the standardised approach and the internal ratings-based approach (which are really two, since there is a foundation and an advanced version). The idea is that a bank can use the standardized approach in order to estimate its credit risk, even if it gives a poorer understanding of the risks and of the riskiness of the assets themselves. The IRB approach has a more complex structure, but it gives an overall understanding of risk, since the bank can identify the risk associated with each position in a portfolio.
With the standardized approach we just divide the whole credit portfolio into different classes using a given segmentation, and we apply different risk-weight coefficients to each of the asset classes. With those coefficients we identify the riskiness of the counterparty and we can assess the credit quality of each subdivision of our portfolio.
With the internal ratings-based approach, we use a model of our own through which we estimate the risk parameters: PD, LGD, EAD and M. With Foundation IRB we estimate only the PD, while with Advanced IRB we estimate all the parameters.
The capital K required for any given loan depends only on the risk of that specific loan; it does not depend on the portfolio to which the loan is added.
Default risk is the risk that a counterparty will not be able to fulfil contractually
agreed financial obligations due to its default.
Migration risk is the risk reflecting the potential loss due to changes in the fair value of credit exposures as a result of some borrowers' rating transitions: for instance, a borrower rated BBB at time 1 that goes into default in the next period. This is what migration risk relates to.
Country Risk is the risk of default on any foreign debt repayment of principal
and/or interest owing to developments within a country that affect its
creditworthiness. We can think of country risk in terms of the spread between two countries.
Concentration Risk in Credit Portfolios is the risk of suffering extreme losses
from an uneven distribution of exposures to counterparties, from contagion
effects between borrowers, or from sectoral concentration (industry,
geographical region, etc.). We can think of the case in which a bank has to decide which counterparty to choose: the best thing the bank can do is to try to diversify its loans.
Residual Risk in Credit Risk Mitigations is the risk of the bank’s failure to realize
the financial worth of transactions intended to mitigate credit risk. It’s the typical
risk when a bank has to liquidate, for instance, a defaulted position: for example, it has to sell the house pledged as collateral for the mortgage and it does not receive the price it expected when it originated the loan.
There are different kinds of segmentation. Each credit risk segment can be used
as a basis for the model estimation. LGD for example can be estimated for large
corporate portfolio, foreign corporate portfolio and so on. So in order to
understand what we are doing with our model, we need to combine this segmentation with the four parameters we were talking about (PD, LGD, EAD and M). Those segments identify the underlying populations for our estimation.
The column credit risk segmentation is the one we refer to when we apply our
credit risk models.
Then there is the Basel segmentation, which is the one with which the regulation refers to the different portfolios of the bank.
Then we have the commercial segmentation that is the one used for the pricing
of the assets and for the credit risk policy that the bank applies to different portfolios.
We work on the first segmentation.
The definition of default is a key concept of Basel 2, and in recent years regulators have also set up new rules for the definition of default. The EBA defines default as a situation in which an obligor does not pay back a loan or a mortgage. The position is classified as default when the borrower has not been meeting its credit obligations for more than 90 days. For past due we need to breach two thresholds: 1) an absolute threshold: if the unpaid amount is higher than €100 for a retail exposure or €500 for a non-retail exposure, we have the first signal of default; 2) a relative threshold, which is the ratio of the unpaid amount over the borrower's entire exposure. If both criteria are violated, the position is classified as default.
A position can be classified as default even without breaching these two criteria: there are also subjective criteria (or unlikely-to-pay criteria), for instance when a borrower decides to restructure a debt or a loan.
The bank can also apply a materiality threshold: for example, if the NPV of the loan decreases by more than 1%, the bank can classify the borrower as unlikely to pay.
Of course the regulator also describes the rules for the infection logic: if the borrower is part of a group (cointestatario, i.e. a joint holder), the default propagates between the group and the single position, so not only the group is in default, but also the single obligor.
A borrower in default can also be cured and come back to performing status, but such a borrower stays under observation for three months: if during these three months the borrower does not breach the two materiality thresholds again, he can go back to performing status.
Treatment of multiple defaults: a borrower in default can go back to performing status, but he can also default again. In that case, if the second default event happens more than 9 months after the end of the first default event, the borrower is again considered in default but the two events are treated as independent of each other; in the opposite case, they are considered as one single default.
Banking regulation
Lesson 15/05
1. Some quantitative indications (how many days you are Past Due, for which amount you are Past
Due)
2. Some qualitative indications (whether there is an indication that your situation has deteriorated such that you will not be able to repay in the future).
These elements, the quantitative criteria and the qualitative criteria, are the core of the definition of default.
Since the beginning, Italian banks have been classifying default in a way different from the one we are looking at right now, which is a European definition: we used to model the probability of default in a slightly different way, because the definition of default itself was different. Now the European authorities are establishing a unique way to detect default, so as to model these events in the same way across Europe. All the banks at the moment are trying to become compliant with the new definition of default, and it is not easy to switch from the previous definition to the new one, because there is an IT effort involved and banks sometimes do not have all the data needed to reconstruct past events. It is a crucial topic at the moment: all banks must apply the new definition of default from January 2021.
So, coming back to the slide (30), we stress again that there is a difference between the performing status and all the other statuses, which are default statuses. For the past due there is a counting of the number of days, but a position can also be classified for other reasons, such as the subjective criteria, for example when the credit manager of the bank has evidence of unlikeliness to pay: you can classify a position directly under the unlikely-to-pay criteria without passing through past due. This differentiation is important for the modeling of the loss given default. We have 3 parameters to calculate the RWA: the PD parameter (the probability of default), the LGD parameter (the loss given default) and the EAD parameter (the exposure at default). For the LGD in particular it is better to recognize all 3 statuses, because past due and unlikely to pay are two statuses from which the counterparty can be cured and go back to performing. Litigation cannot be cured: it is the last part of the process, where the bank starts, for example for a mortgage, the legal process to sell the apartment, the collateral of the mortgage.
To summarize:
As we go from past due to unlikely to pay the situation gets worse and worse, and when we are in the litigation status the loan cannot be cured, we cannot go back to performing. This is the crucial element for modeling the loss given default. Why? Because when discussing the probability of default model in the exercise, we do not take these three cases (past due, unlikely to pay, litigation) as 3 different cases: we do not care about the different default statuses, since we are just trying to model when a position moves from performing to default.
Obligors can either be defaulted or not. Litigation, for example, is when a company goes bankrupt: it is a status in which the company has no possibility of paying back the loan to the bank, nor of surviving, since it is the last part of the company's life.
We must consider both the relative and the absolute threshold, and they must be broken simultaneously. Just looking at the past due, we can make an example.
Let us suppose a bank lends 200 euros to a counterparty and that the unpaid part of the loan is just 2 euros. If the counterparty has not paid this amount for more than 90 days, would you say it is in default? No, because we are not above the absolute threshold.
How large must the unpaid amount be for the position to be set as default?
There are two elements: the absolute threshold and the relative threshold. Let us say the absolute threshold is 100 euros and the relative threshold is 5 percent. In case the loan is 1000 euros, how much must the unpaid part be for the position to be set as default? Remember that both conditions must be met to consider the person in default. The first question is: did I breach the absolute threshold? At least 100 euros must be unpaid. Second: is the unpaid amount at least 5% of the total loan?
If the unpaid amount is greater than 100 euros, we are sure that the relative threshold is also met in this case, since 100/1000 = 10% > 5%.
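The same joint-threshold logic as a small sketch; the values (100 euro absolute, 5% relative, 90 days) follow the lecture's example, not the exact EBA calibration:

```python
# Sketch of the joint threshold check from the example above.

def is_past_due_default(unpaid: float, exposure: float, days_past_due: int,
                        abs_threshold: float = 100.0,
                        rel_threshold: float = 0.05) -> bool:
    both_breached = (unpaid > abs_threshold
                     and unpaid / exposure > rel_threshold)
    return both_breached and days_past_due > 90

print(is_past_due_default(2.0, 200.0, 120))     # False: 2 < 100 absolute
print(is_past_due_default(60.0, 1000.0, 120))   # False: 6% > 5% but 60 < 100
print(is_past_due_default(120.0, 1000.0, 120))  # True: 120 > 100 and 12% > 5%
```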
The definition must be harmonized among all the institutions, because otherwise it can vary a lot: if it is not unique across banks, each bank ends up modelling different events. All banks should model and apply the same definitions. What we should really understand is that supervisors use a very strict identification of default; the problem is that you classify as default positions that are still alive, since positions that are past due or unlikely to pay are in most cases still doing business, and this makes the measurement of the loss given default (once you go into default) quite difficult. This is additional complexity in the estimation of your parameters, and it is the complexity you must face when estimating those models.
Slide (32):
We can now have a look at the theoretical concept of PD model estimation, which can be summarized in 3 steps.
First, the module development: the probability of default can be explained by risk drivers, i.e. criteria that represent the specific features of the positions belonging to the portfolio for which you want to estimate the PD. Yesterday we said we can have different portfolios: for example, a large corporate portfolio, a business portfolio, a retail portfolio; each of these portfolios can have different modules, because not all the features are equal across obligors.
The second part of the module development is the development phase, in which you estimate the different models and produce the output of the model for each module.
We want to understand whether a performing counterparty, one that at the moment is paying its loan, is going to default in the future. This is what we are trying to do. When we say that different modules have to be assessed, it is because there are different features: I cannot say that all counterparties belonging to all portfolios have the same characteristics. There are different characteristics that we can detect using risk drivers. Let us split this problem into parts. Imagine we are working with a corporate portfolio: you are giving loans to corporates. In this case it is important to analyze the financial module and to assess the financial drivers of those corporates. I can look at total assets, or at the socio-demographic features of the counterparties, such as the geographic location or the kind of counterparty. There are a lot of elements to look at. There are also risk drivers that I can group in the module called internal behavioral: how is this counterparty behaving with respect to the bank? Is this counterparty paying all its loans? If not, how many is it not paying? So, we want to identify, from different perspectives, the behaviors that the counterparties display with respect to the bank, and we split these elements into different modules. Each of these is a single module, and we estimate a logistic regression for each of them to identify the final score (the output of the logistic regression) related to each topic. Before the logistic regression you can also study the variables: you have different modules for different features. If, for example, you have a corporate portfolio, what is important is to study the financial statements of the corporates. You can study the individual variables, or you can combine them, and you can also do it from a statistical point of view: you can study the trend of each variable and the correlation among the variables on your list. You start with a list of all the candidate variables which can affect the risk of your portfolio, and then step by step you select the best variables with some statistical criteria. What could be a statistical criterion to select a variable? Given that you must choose among a lot of variables, what can you do to discriminate among all of them? Not which variables, but the measure used to do it. Imagine we have a lot of indicators, all of them good: I have, let us say, 10 drivers; what can be the way to measure the performance of those variables and say which is the best indicator? Statistical significance, e.g. the t-stat. That is the univariate analysis: you run each variable against the same dependent variable (the one you also use in the development phase of the logistic regression) and you look at the p-value, the significance of your beta; if it is not significant, you can discard that variable.
We take all the variables belonging to the initial list, say 20 variables, and we test each of them by running a single regression per indicator. The dependent variable is the (0,1) default/non-default flag, and the regressor is one of the 20 variables in the initial list. I run 20 regressions and I look at the results: first, the statistical significance, so the p-value; second, the performance of that model, which is a univariate model because it has just one independent variable against the target variable.
The first step is to run those univariate regressions to see which variables have no explanatory power for your dependent variable; the second step is to see how the variables interact with each other. Normally, the more variables a model has, the better it fits; this does not work with a PD model, where the number of variables cannot be too high. You need to pick the variables that mean something economically speaking, not just the most powerful ones statistically speaking: each variable should offer a good trade-off between statistical significance and economic sense. When you have a lot of variables you do some steps automatically, and you could end up with some variables without any economic sense, but at a certain point you look at them one by one to understand them.
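A sketch of this univariate screening step, assuming the development sample sits in a pandas DataFrame with a 0/1 default flag; column names are hypothetical:

```python
# Sketch of the univariate screening step: one logistic regression per
# candidate driver against the default flag, keeping only significant ones.
import pandas as pd
import statsmodels.api as sm

def univariate_screen(df: pd.DataFrame, candidates: list,
                      target: str = "default_flag",
                      p_threshold: float = 0.05) -> list:
    kept = []
    for var in candidates:
        X = sm.add_constant(df[[var]])             # one regressor at a time
        fit = sm.Logit(df[target], X).fit(disp=0)  # univariate logit
        if fit.pvalues[var] < p_threshold:
            kept.append(var)                       # significant beta: keep it
    return kept

# kept = univariate_screen(sample, ["debt_to_turnover", "roa"])
```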
The dependent variable is a flag (1,0). In the sample you have just some snapshots, for example at end of year, and you look at your portfolio at that time; but the flag does not reflect the status of the obligor at that given date, it reflects whether the client will default over a period of time.
To refresh the concept:
I take a reference date: I place myself, for example, at December of a certain year and I look at the next year, so I have an observation window of one year. On the first day all the counterparties are performing, they are paying their loans; then I look one year later and ask how many of those counterparties have defaulted, modeling this flag by taking the drivers at the reference date and the outcome one year later. This is my target variable, and I try to estimate the probability of migrating from performing to default in the next year. We call this window a snapshot; say that we want to model the PD over a 5-year time horizon. I take the counterparties in, let us say, 2012, in which they are performing, and I ask whether they default in the following year: the counterparties that default are flagged with 1 in 2013, and I do the same with the following years. After the 5 years I have my development sample, the sample I use for the estimation, with my default flag for the 5 years. I do all of this for all the counterparties in the portfolio; then I want to predict this flag by looking at different risk drivers. We have the target variable, and we want to estimate it.
Making an easy example: suppose all ten counterparties are performing in the first year, so all flags are 0; the year after, all are still 0 except for 3 of them, so the default rate is 3/10 (30%). Now I want to estimate this number: I look at all the characteristics of the counterparties and I try to use them to predict the default. Suppose I notice that the 3 that went into default all had the variable x, which can take value a or b, classified as a: looking at the risk drivers, I have a risk driver x that takes value a whenever the counterparty has the default flag equal to 1, and vice versa. In this case that variable is perfect, since it is exactly what is needed to predict the probability of default.
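A sketch of this snapshot construction, under the assumption that yearly statuses sit in a long-format table; names are hypothetical:

```python
# Sketch of the snapshot construction: at each reference year, take the
# performing counterparties and flag those in default one year later.
import pandas as pd

def build_default_flags(status: pd.DataFrame) -> pd.DataFrame:
    """status has columns: counterparty_id, year, in_default (bool)."""
    nxt = status.rename(columns={"in_default": "default_flag"}).copy()
    nxt["year"] = nxt["year"] - 1  # align the t+1 outcome to reference year t
    performing = status[~status["in_default"]]
    sample = performing.merge(
        nxt[["counterparty_id", "year", "default_flag"]],
        on=["counterparty_id", "year"], how="inner")
    return sample.drop(columns="in_default")

# Stacking five reference years gives the development sample;
# the observed default rate is sample["default_flag"].mean().
```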
Let us say this is our reference date, at which I place myself to identify my default flag for the next year. At that date all the counterparties are performing, with default flag 0. Then I place myself one year later and I look at the resulting default flag, which can be either 0 or 1.
Now we want to look at the indicators, the risk drivers. Imagine we have two different families of drivers: the financial and the behavioral module. In the financial module we take some variables, such as the ratio of bank debt over turnover and net income over total assets. All the counterparties belonging to my sample have the values shown in the column for each variable, meaning that I can look at the two ratios for all the counterparties in our sample. To understand whether the chosen drivers are good, I run a logistic regression in which I put the default flag as target variable and the ratio considered as independent variable. I look at the estimates of the parameters of the logistic regression in order to analyze the coefficients and see whether they are significant; if yes, it is a good driver for estimating my probability of default.
Just a little focus on variable selection and the univariate analysis we have already talked about. We find these two steps in all models, also in LGD models and EAD models: the variable selection step, in which you run the univariate analysis, is common across them, while the construction of the development sample differs between models; some of the steps through which you find the predictors are shared across the models.
What do you expect as the relation with the probability of default if you are talking about the earnings of a company as a regressor? The lower the earnings, the higher the probability of default. Having that type of data sample suggests we also analyze other factors: if one year you have 100 of earnings and the following year 80, earnings are decreasing; but if, with respect to those earnings, the company is sustaining lower costs, it is likely that the second year will be better than the first.
The same argument can be made if you deal with the ratio of earnings over total assets.
What do you expect if we are talking about the volatility of the cash flows of the corporate (measured as how much they change across the financial statements over the years)? Increasing volatility is negative with respect to the probability of default, because there is a greater risk of a sharp drop in the cash flow next year, even if in the current year the corporate has a healthy cash flow. The higher the volatility, the lower the stability of the cash flows: so you can link stability to a less risky corporate and higher volatility to a riskier corporate.
For the behavioral module we select two different variables. One is the number of days in overdue: we know these clients are in performing status, but at the same time some days in overdue may already have been counted. If this number is higher, is the client riskier or not? It is riskier: the more days in overdue, the more likely the client will be flagged as default, so it is really a good indicator. The behavioral module captures the behavior of the clients across their accounts, mortgages and all their lines in the bank. The second driver is the utilization rate: imagine I have a credit card with a 1000 euro limit, but I can use just one part of that amount each month. If a client uses more than what the bank granted, what will be the relationship between the utilization rate and the default rate? The higher the utilization rate, the higher the probability of default. That is the reason we need variables with economic sense. The output of these 2 modules is what is shown in the file: the score. The score is the output of the logit model, and it is transformed into a probability of default. The formula is in the blue box.
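The blue box itself is not reproduced in these notes; presumably it contains the standard logistic transformation that maps the score s (the linear predictor of the logit model) into a PD:

```latex
\mathrm{PD} = \frac{1}{1 + e^{-s}},
\qquad s = \beta_0 + \beta_1 x_1 + \dots + \beta_k x_k
```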
We do this with the two variables of each module, obtaining the final scores, which are the output of the logistic regressions: what we have at the end are 2 module scores. Then you go through the second phase, in which you take the output of all the modules and combine it in order to have a unique score for each counterparty. To combine them you use again a logit regression: you put the modules together, but using as regressors just the scores that are the output of the module-level logit regressions, instead of the single risk drivers. In this way what you get is the integrated score, which represents the combination of all the modules, of all the characteristics and features you used to model the probability of default. The output is the integrated score, as I have already said, and then you can use the formula in the blue box to get the integrated PD.
Repeating it:
Imagine you got the final scores (outputs of the logit regressions) from the two different modules: we run a logit regression on the financial module and the same on the behavioral module, and we get those two final scores. Now we want to combine them into one model: we run a third regression in which the indicators, the risk drivers, are not the same as before but are the final scores themselves, obtaining a final value called the integrated score. Then you transform this score into a PD value, finding the integrated PD. In the formula, the argument of the exponential is what we call the score.
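A sketch of this integration step, reusing the same hypothetical column names as the screening sketch above:

```python
# Sketch of the integration step: the module-level scores become the only
# regressors of a second-stage logit.
import pandas as pd
import statsmodels.api as sm

def integrate_modules(df: pd.DataFrame) -> pd.Series:
    """df has columns: financial_score, behavioral_score, default_flag."""
    X = sm.add_constant(df[["financial_score", "behavioral_score"]])
    fit = sm.Logit(df["default_flag"], X).fit(disp=0)
    return fit.predict(X)  # integrated PD = logistic(integrated score)
```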
What is missing is the calibration phase, in which we want to reflect in the PD of the customers the long-run average PD. We can do this in 2 steps: first we compute what is called the central tendency, which is nothing other than the long-run average default rate of the portfolio; then we use a calibration function to minimize the distance, such that the average probability of default in your portfolio matches this central tendency. The last step is to construct the rating classes, which we can do with a statistical method (such as the K-means method): you assign each observation of your sample to the rating class which minimizes the distance between the average PD of the rating class and the value of its own PD (the best class has the lowest PD). I can then map those PDs onto a scale to rate them.
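A minimal sketch of the rating-class construction via K-means; the number of classes (7) is an arbitrary illustrative choice:

```python
# Minimal sketch: cluster calibrated PDs with K-means, then relabel the
# classes so that class 1 has the lowest average PD.
import numpy as np
from sklearn.cluster import KMeans

def assign_rating_classes(pds: np.ndarray, n_classes: int = 7) -> np.ndarray:
    km = KMeans(n_clusters=n_classes, n_init=10, random_state=0)
    labels = km.fit_predict(pds.reshape(-1, 1))  # cluster on the PD values
    # Relabel clusters by centroid so that class 1 = lowest average PD
    order = np.argsort(km.cluster_centers_.ravel())
    rank = {old: new + 1 for new, old in enumerate(order)}
    return np.vectorize(rank.get)(labels)

# classes = assign_rating_classes(calibrated_pds)
```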
Now run the logistic regression and look at the solution in the Excel file.
BANKING REGULATION, 21/05/2020
Given that we are already in default, we do not care about the performing status, but only about past due, unlikely to pay and litigation. We can differentiate among these three with different criteria, which get worse and worse as we go from past due (first stage) to litigation (third stage). We can define the components of the model, which are:
• Migration rate (this one and the next are linked to past due and unlikely to pay): this model component is used to model the probability of migrating from one status to another. The migration can be in one direction, from the best to the worst position, or in the inverse direction.
• Exposure variation: it is linked to the same concept, but we are not talking about probabilities; rather, about how the exposure changes when I move from one stage to the other.
• Loss given litigation (this one and the next one refer to the litigation status): it is aimed at taking into account the realized loss when we go through litigation.
• Adjustments: aimed at taking into account some phenomena that cannot be perceived just by looking at the positions; for instance, if there is a downturn, I want to adjust my estimate in order to take this element into account as well.
MIGRATION RATE
We have said that this is the probability of migrating from one status to another. Starting from performing, we can arrive at default passing through past due or unlikely to pay (which are intermediate steps) or directly at litigation. From the intermediate steps we can either worsen towards litigation or become performing again. There are also counterparties that do not meet their obligations for a while but then resume paying: once I am sure that they are able to repay their obligations, I can classify them as performing again.
We update the data every time we have to use the model from the beginning; otherwise we recalibrate the model, including the new data and rerunning the same model to see how the parameters change. This does not happen more often than once a year. We also have to update the data every time there is a change in the rules under European law.
We can go from performing directly to litigation if, for example, the debtor goes into bankruptcy, or dies and there is no one who can repay the loan.
• Sample identification: first entry, we have to identify in which position the clients are, so, according to the graph, in which branch we are; cure rate, when the position is in a default status and goes back to performing, starting to pay its debt again; danger rate, the probability of going to a worse situation (litigation).
• Estimation: we can think of it as measuring the frequency with which each counterparty moves from one status to the other.
Excel Exercise
First default classification: Past Due, Unlikely to Pay or Litigation.
Last default classification: either Litigation or going back to Performing (-). Litigation is the only absorbing status, because from there you cannot change your situation; the complexity is that the other statuses are not absorbing.
In real life it often happens that positions go from PD to UtP to Litigation, while in the example there is only a direct shift from one status to another.
A position that goes from PD (or UtP) to Litigation contributes to the Danger Rate. If instead a position goes from PD (or UtP) to Performing, it contributes to the Cure Rate.
-First question: we have to count how many counterparties are actually in past due status (56). The same for unlikely to pay.
-Second question: the past due danger rate is the ratio of the counterparties that migrate from PD to Litigation over the total PD, i.e. how many counterparties have migrated to the worst default classification:
Danger Rate(PD) = (PD → Litigation) / (Total PD)
For the unlikely to pay it is the same:
Danger Rate(UtP) = (UtP → Litigation) / (Total UtP)
-Third question: the past due cure rate is the ratio of the counterparties that become performing again over the total PD (or 1 - Danger Rate):
Cure Rate(PD) = (PD → Performing) / (Total PD)
For the unlikely to pay it is:
Cure Rate(UtP) = (UtP → Performing) / (Total UtP)
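The same computations as a small sketch over a table of positions; column names and status labels are hypothetical:

```python
# Sketch of the Excel exercise: danger and cure rates as simple
# frequencies of migrations between default statuses.
import pandas as pd

def migration_rates(df: pd.DataFrame, status: str) -> dict:
    """df: one row per position, with first_status / last_status in
    {'Past Due', 'Unlikely to Pay', 'Litigation', 'Performing'}."""
    sub = df[df["first_status"] == status]
    return {
        "total": len(sub),  # e.g. 56 for Past Due in the exercise
        "danger_rate": (sub["last_status"] == "Litigation").mean(),
        "cure_rate": (sub["last_status"] == "Performing").mean(),
    }

# migration_rates(sample, "Past Due"); migration_rates(sample, "Unlikely to Pay")
```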
EXPOSURE VARIATION
In this case we do not consider the performing status, and we are not talking about probabilities anymore: we are looking at the exposure. Imagine that the exposure of my mortgage is 324 euros and I stop paying the instalments, so I am classified as default; if I then pay back only 100, my exposure has changed. My exposure can change when I go from one status to another.
The important thing is that we are looking only at default positions, because the variation of exposure from performing to past due (or UtP) is already taken into account in the exposure at default (EAD) parameter.
• Sample identification: cure rate, as before, going back to performing; danger rate, going to litigation. The only difference is that here we are not looking at probabilities.
Excel exercise
We look at the performing exposure when we want to estimate the cure rate, and at the exposure at the last default classification for the danger rate.
If the exposure at the first default classification is equal to the exposure at the last default classification, there is no exposure variation.
If the exposure at the last default classification is higher than at the first, it could be that the bank has given money to a firm that is already in default, without being aware of it. For the residential mortgage, the client should repay for example 318, but he is in default and cannot pay; when he is classified as litigation, the amount is higher because the bank has started some processes to get its money back, so it charges more costs, or simply adds more interest.
If the exposure at the last default classification is lower than at the first, it could mean that the client has repaid some instalments, but not enough to be classified as performing, because there are some thresholds to consider. There is a probation period in which the position keeps paying back the instalments, but I want to be sure it will also pay the future ones; if you do so for the entire probation period, you are classified again as performing.
ADJUSTMENTS
These are elements that are not clearly observable in our data and are used to adjust the previous results.
-Downturn component: for example, at a certain point in time, like 2008 after the crisis, when working on the model you realize that some loss rates are really higher than the others realized over the entire time series; the idea is to isolate this effect and see its impact on the loss given litigation: look at what was realized in these worst observations and use it to adjust the estimate, adding something more.
-Open positions: we have been talking about closed positions until now, but there are also positions that are in default yet not closed. Think of a debtor who has a mortgage and stops paying: he is classified as past due, then as unlikely to pay, then in litigation, but the bank is still waiting to sell his house. What do we do with these positions? We still want to know what they are telling us.
We take into account the most recent positions to see their loss rate. Suppose we detect that people from the north are riskier than those from the south, while the current portfolio of the bank is composed only of people from the south: then the actual portfolio is much less risky than the development sample suggests. So, in the end, we want to look at the positions that are not closed but still exist in the portfolio of the bank: what are the characteristics of the positions that are not taken into account in the development sample but still exist in the actual portfolio of the bank?
*Answer to Luca's question about applying these models for other purposes as well:* all the models discussed so far are for capital estimation purposes, but some transformations of them are also used for managerial purposes, like pricing the mortgage, i.e. its interest rate. They can also be used for provisioning, i.e. estimating the expected loss component for the bank, or for identifying recovery targets: for instance, sometimes the bank assigns a recovery target to the agent who is going to carry out the recovery activity on the basis of the actual recovery estimated through the model.
-Disposal: once a bank has accumulated a lot of non-performing loans, the supervisor comes and says that it needs to sell some of them to recovery companies. In this case the bank realizes a huge loss rate, because it has to sell at a low price, so a low recovery is associated with those positions. Disposals are not included in the estimation because they are atypical positions, associated with anomalous loss rates; for this reason they are taken into account afterwards, so as not to lose this information.