
Banking Regulation

Lecture of 8/05/2020
Principles of Banking Supervision (1)
What are the basic expectations we have toward banks in terms of risk management and modelling practices? The focus will be credit risk. The purpose is to give two different perspectives on the same issue: on one side, as Central Bank supervisors we can present our expectations toward banks' credit risk modelling; KPMG can give the perspective of people actually supporting credit institutions in improving their risk management processes and developing credit models.
(Slide 1) Why are banking supervision and banking regulation so important? There are, basically, financial markets, and their role is to channel resources from people who save money to those who invest and put this money into production. This process is subject to asymmetric information issues: both moral hazard and information asymmetries are at work, because savers are not able to screen the behavior of the entrepreneur and they are in a worse informational position than entrepreneurs when it comes to assessing the quality of the investment activities.
So, banks are at the heart of this mechanism, sitting between savers and investors: they collect deposits, screen investment projects and monitor borrower behavior, ensuring that the money that gets lent is not used in the wrong way. In doing this they generate benefits for the financial market because they mitigate asymmetric information. For a number of reasons, banks are better placed than individuals to perform this institutional role:

• There are economies of scale (lower monitoring costs than individuals would face)
• With their clients they may have long-term relationships: it is not a one-shot game but a repeated game, which yields a more comfortable equilibrium
• Together with lending services, banks also provide ancillary services such as payment services, liquidity, and deposit facilities, and these are additional sources of information on the behavior of their clients
The standard classic literature says that you cannot eliminate the moral hazard issue, and one further issue remains: it is not clear who is going to monitor the banks. Nowadays there is a lot of discussion about the quality of the management of banks, and it is part of the moral hazard problem. So, since there is this moral hazard, depositors ask to be protected and, in any system, there are both implicit and explicit safety-net protections. In general, savers perceive their deposits as risk-free, not exposed to the risk of default of an institution. Since deposits are perceived as risk-free, depositors do not ask for risk premia, and this exacerbates the moral hazard issue for banks, which have an implicit incentive to take extra risk: it increases the profitability of the business as long as the risk is not priced by depositors. That is why there is supervision: it minimizes the cost of protecting depositors, avoiding that a problem in the management evolves into a loss for depositors.
(Chart: Negative Externalities) We already said that bank failures generate negative externalities. So, what happens when a bank fails? It is a bad message for the entire banking system: one bank failure may signal that other banks could be in a similar situation, and this means additional costs for the rest of the system; this is a negative externality. In addition, banks accumulate information on their clients over long-term relationships: they hold a real capital of information that helps clients access funding. If a bank fails, these clients may lose the information built up in the relationship, and this is an additional negative externality. There are also direct interconnections: derivatives, OTC exposures and the interbank market make banks intertwined with each other. So there are many ways in which the failure of one bank can affect others that are exposed to it, and this also creates additional uncertainty. Just to provide an example: in 2008, with the failure of Lehman, in Banca d'Italia but also in the other supervisory authorities, understanding each bank's exposure to Lehman was a matter of weeks, not days, simply because the level of interconnection was very high and the idea that banks could fail was perceived as remote. Nowadays this has changed, but direct interconnections are still at the core of the problem.
In addition there may also be a fire sales effect: a bank liquidation means that a lot of assets are going to be sold on the market, and this selling puts downward pressure on prices, and prices affect the market value of the balance sheets of other banks. So there are also indirect interconnections among banks.
All of this means that the financial system becomes less capable of serving the economy in terms of financing, and this affects firms, the ones producing in the real economy, with consequences for the real economy. If you think of the sovereign crisis of 2012: we had a macroeconomic shock that triggered uncertainty in the European framework, this uncertainty had a negative spillover on the peripheral sovereigns, increased spreads affected banks, banks were less able to serve the real economy, and that is why there was the downturn at that time.
(Chart: Possible Approaches) The banking sector is heavily regulated and we have several tools. We may control the composition of the management of a bank, regulating requirements for managers and shareholders. We may have restrictions on competition: all banking activity is subject to banking licenses, so we can restrict competition. We can also impose restrictions on the types of assets that can be held; think of the discussion on the distinction between commercial and investment banking activities, whereby we avoid that the collection of retail resources finances investment banking activity.
We also ask for disclosure requirements: there are several pieces of information that we require banks to disclose to the public, because the more information a bank provides, the less information asymmetry there is. Finally, and most importantly, we ask banks to keep skin in the game: whenever there is a loss, the first portion to be eaten is the capital of the bank, and there is also a minimum capital requirement that any bank needs to hold. We ask for capital that is proportional to the risk of the institution. The greater the capital required, the lower the probability that a loss on the banking assets will translate into a loss for depositors, who are debt holders of banks.
Supervision changes from country to country, but in any advanced economy banking supervision means a mix of all these instruments (bank chartering, disclosure requirements, restrictions on asset holdings and activities, restrictions on competition, and capital requirements), all working together in a joint effort.
(Slide: After the 2008 financial crisis) Supervision has changed a lot: after 2008, progressively, we moved from a light-touch approach to a more intrusive one.
Light touch: give the bank a few fundamental principles, let the bank regulate itself, and try to intervene only where there are anomalies. In 2008 we realized that this was not the best approach and something more intrusive was necessary. Nowadays we have more interaction with board members; there is regular discussion and physical attendance at board meetings, which was not usual in the past.
We have fit and proper assessments for any senior manager who wants to join a bank: if a bank wants to change its CEO, it has to ask permission from the supervisory authority, and supervisors interview the candidates before granting permission. Supervision tries to avoid banks becoming, let's say, "one man shows" dominated by a few important managers, and also ensures that transitions between managers go smoothly, without major discontinuities; this is expressed in banks' succession plans. There is increased stress on risk appetite and risk culture, so increased regulation in this area but also increased importance of risk management. One lesson we learned from the financial crisis is that whenever a bank suffered large losses, probably the role of the Chief Risk Officer had not been put at the same level as the other executive managers; a lack of risk culture in general is a possible source of losses. We also learned that models and risk management tools are never perfect, and there is a kind of endogeneity in the process: as time passes, new risks emerge, and it is not certain that the tools you have in place are fully capable of capturing those risks.
That is why we have the Follow the Money approach: rather than following only the risk exposure, we try to better understand the main sources of profitability of an institution, starting from the profits to arrive at a better understanding of what could be going wrong. We also try to be more forward looking rather than backward looking in our assessments, and in this regard stress testing has played an important role.
I also have to say that the paradigm has changed a little: since the negative externalities mentioned so far are now perceived as more important, on top of microprudential supervision an additional layer was added, named macroprudential supervision, which aims to identify general tendencies in the market, for example real estate bubbles, or general practices in the industry, for example FX lending or an overheating of the economy, i.e. excessive lending behavior. These cannot be detected from the point of view of a single institution, but they can really generate a systemic risk for the entire system. Now we have these two layers working in parallel.
*Question: so there is first micro supervision and then macro supervision that captures the externalities of the banks?
Answer: they are two sides of the same coin. Basically, I work in the micro supervision arena, so I really follow the behavior of one single institution. On top of this there are other people who analyze the market from a broader perspective, identifying general trends in the market or in the macroeconomy that are not very visible at the institution level but that may affect the risk of the whole banking system. Two prominent examples. Real estate: in the subprime crisis in the US there was a real estate bubble, with inflated prices; people used these higher real estate prices to obtain additional lending; once the market changed direction, there was a downturn in real estate prices, these price falls translated into losses on the debt, and those losses triggered a major crisis in the banking sector. So identifying general macroeconomic trends is also a good way to identify potential threats that may not be seen in isolation when you look at each single bank, but that from a broader perspective may be a real danger.
Another issue is FX lending, which is quite typical in other jurisdictions: you lend to retail clients in another currency, and of course a shock to the domestic currency may have adverse consequences for the quality of the portfolio.
(Slide: Actors Involved) Banking supervision in Europe is a really complex framework. We have:
The Basel Committee: it is the international standard setter; it sets the standards for banking supervision across the globe. Those standards are generally translated into regulation at the European level by the European Parliament and Commission. Most of the banking regulation framework is written in European regulations that are binding in every EU country. In doing this, the European Commission is supported by the European Banking Authority, a public agency that works mostly as a standard setter: it analyzes and provides standards and advice that are translated into regulation by the European Commission. On top of this first layer, which is basically banking regulation, there are the supervisors:
The National Competent Authorities, like Banca d'Italia, that act as supervisors. Every national authority is part of the broader Single Supervisory Mechanism, which is coordinated by the ECB with the participation of all the national competent authorities of the countries that have the euro, and in some cases beyond those countries.
(Slide: Basel Committee) The Basel Committee was set up in 1974, and harmonized banking regulation started early, in 1988, when we had the first generalized principles of banking regulation. We started early because the idea of negative externalities across the system was already an important pillar at that time, and there was also the idea of trying to ensure a level playing field, some minimum degree of coordination among the different regulations around the globe. In Basel almost all countries are represented. There were then several enhancements of the regulation; Basel III is still an ongoing work: it was launched in 2010 but it is still being completed. I can also highlight that the general tendency was toward lighter regulation until roughly 2004, and starting from 2008, after the Lehman collapse, the tendency reversed: we entered a much more restrictive regulatory regime for banks. Now the number of rules is increasing; you need a generalized framework because the level playing field is a key component in the industry. If one institution has a competitive advantage because it faces softer regulation, that may be dangerous for the entire system, because competitive pressure may inject the wrong mechanisms into the machine.
(Slide: European Banking Authority) The European Banking Authority was set up in 2011, in the aftermath of the financial crisis. In 2011 we were in a position where the general Basel principles existed but were implemented in very different ways across EU countries. There was a felt need for a common single rulebook, a single regulation for all EU countries, and with this purpose the EBA was set up as part of the European system of financial supervision. There are 3 authorities: the EBA, the European Securities and Markets Authority (ESMA) and EIOPA, which covers insurance and pension funds. ESMA is the standard setter analogous to CONSOB. Together with these, the system also comprises the European Systemic Risk Board and all the national competent authorities. The ESRB is the one that coordinates the European effort on macroprudential issues. The main tasks of the EBA are to come up with the European single rulebook, but also to analyze the market and assess the risks and vulnerabilities in the EU banking sector. One way it does this is through stress testing.
(Slide: Banking Union) On top of this there is the Banking Union. The Banking Union was set up in 2014, when tension in the sovereign market was perceived as an element of fragility for the banking system. There was a need to break the connection between sovereign risk and the banking system: on one side, if a bank fails it is a problem for the state, which has to step in and clean up the issue by putting in financial resources to resolve it; on the other side, the most prominent investment of a bank is its sovereign holdings, so a fall in the price of sovereign bonds means a loss on the inventory of sovereign bonds that any bank has on its balance sheet, and this loss also means less money to finance the economy. On top of this, the Banking Union was also meant for structural reasons: there was the idea that we should avoid the fragmentation of financial markets. Liquidity conditions and financial markets working in different ways across different areas were perceived as an element of fragility and of inefficiency in the transmission of monetary policy, and there was the idea of avoiding ring-fencing practices.
(Slide: Fully fledged banking union) There is a well-known statement that banks were European in life but national in death: we had a very integrated banking system across Europe with several truly European institutions, but when these institutions went into crisis, the problem fell mostly on the national competent authority where the large banking group was seated, while the losses affected several other countries. We want to avoid such anomalies in the system, and also avoid that supervisors act purely in the interest of the local jurisdiction where they sit. Hence the fully fledged banking union, composed of three elements. On one side there is the Single Supervisory Mechanism, which means that all the institutions in the euro area are supervised by one single institution. There is a Single Resolution Mechanism: the idea that every banking crisis is managed by one institution, with the aim of avoiding that a bank failure translates into losses for the taxpayer.
A Single Deposit Guarantee Scheme was also planned: the idea that deposits should be guaranteed by insurance against losses up to 100,000 euros, provided centrally and for all banks. This last pillar is not yet in place and is still one major pitfall of the banking union.
(Slide: SSM in practice) In the ECB we have the decisional bodies, the Supervisory Board plus the Governing Council; then all the supervisory operations are carried out by four Directorates General: the first two are line supervision, the other two are horizontal functions. DG III handles the supervision of smaller institutions, while DG IV supports line supervision with internal model experts, policy, legal, consulting and the rest. The core part is done by the Joint Supervisory Teams, teams made up of people from the national competent authorities and people from the ECB, working together in the same team supervising the bank. Just to give an example: I am in the JST of UniCredit. I have two bosses, one in Banca d'Italia and one in the ECB; the latter is the head of the JST. I am the head of the risk team, physically doing the supervision of the internal models of UniCredit, and I have 10 people working with me, coming from the ECB, from Banca d'Italia, or from the other NCAs involved in the supervision of UniCredit.
(Slide: Banking supervision in a nutshell) What do we do in supervision, in a nutshell? We need to understand the risks of an institution, so we constantly monitor the risks that a bank incurs (market, credit, operational, liquidity risk), and against these risks we check the adequacy of the safeguards it has in place. The safeguards are first of all in the space of governance: banks need to be well managed first of all. Then there is the Internal Control System, with its 3 layers: line controls, risk management, and internal audit. Then we try to investigate the sustainability of the business model of the institution and, finally, the adequacy of its financial resources: how much capital the bank has and whether this capital is appropriate. It is really an issue of balance: the safeguards a bank has in place (governance, risk management, profitability, and capital) must be adequate to the risks it runs. So, from larger and more complex banks we ask more; if a bank is a community bank, we also accept something less sophisticated. Everything needs to be proportional, and the real challenge of supervision is really to find this balance.
(Slide: Basel II) Standard supervision under Basel II is made up of three pillars:
1) Minimum Capital Requirements: we try to be risk sensitive; if a bank is subject to more risk, we ask for more capital. This pillar relies on a predefined scheme: a restricted set of rules explains and regulates this area and tells how much capital the bank needs to hold. This minimum capital requirement does not necessarily capture all the risks a bank may incur; against this, we have the second pillar.
2) Supervisory Review: we ask the bank to have good risk management capabilities, to assess all the relevant risks it may incur in doing business, and to measure the economic capital, the financial resources it needs to hold against these risks; the supervisor reviews this whole process on an ongoing basis.
3) Market Discipline: if investors get information, they can process it, support supervisors in supervising the bank, and exert their pressure in the right direction. With this pillar we ask for support from other actors that invest in the bank and may incur risk, and that can encourage good behavior from institutions.
(Slide: Standard operations for a bank) Consider the standard balance sheet of a standard bank: it collects sight deposits, which are liabilities, invests them as assets, and puts in equity. (Right side of the chart) The bank earns interest income from loans and pays interest expense on deposits; the difference is the Net Interest Income. Of course there are operating expenses, the cost of having branches, IT systems, risk management, and so we get Operating Profits. Operating Profits are a stochastic variable and may change as a function of the risks a bank faces, and there are several risks in this context.
(Next slide) First of all we have pure credit risk: there may be a deterioration of the credit quality of the portfolio. If some of my customer loans default, the borrowers fail to pay: I have less interest income from loans and I suffer losses on the principal of my investments, reducing the operating profits.
Liquidity risk: customer loans are term loans, so there is an implicit maturity transformation in my job, and of course if depositors ask for their money back, I have to monetize the customer loans in order to repay them.
Interest rate risk: the interest earned on loans and the interest paid on deposits are functions of current interest rates; they may have different elasticities with respect to interest rates, and this may generate volatility in my net interest income.
Operational risk: associated with failures or mistakes in the business operations. If you think of a standard bank, the only thing that is not stochastic is the costs: you have operating expenses that you need to bear in any case, while all revenues are stochastic variables, subject to risk.
This scheme was in vogue in the '70s-'80s, when there was the standard 3-6-3 model: the idea that you take deposits at an interest rate of 3%, lend at 6%, and earn the 3% difference, so you can earn easily with this model. Nowadays this model is no longer viable because there is much more competition among banks.
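As a minimal numerical sketch of this margin logic (all figures are illustrative assumptions, not data from the lecture):

```python
# Minimal sketch of the 3-6-3 margin logic with illustrative figures.
deposits = 100.0          # funding collected from depositors
loans = 100.0             # amount lent out
deposit_rate = 0.03       # 3% paid on deposits
lending_rate = 0.06       # 6% earned on loans

interest_income = loans * lending_rate                     # 6.0
interest_expense = deposits * deposit_rate                 # 3.0
net_interest_income = interest_income - interest_expense   # 3.0

operating_expenses = 1.5  # branches, IT, risk management (assumed)
operating_profit = net_interest_income - operating_expenses
print(net_interest_income, operating_profit)               # 3.0 1.5
```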
(Slide: Standard risks for any commercial bank: Credit Risk) This risk works as follows. We have customer loans; if a default occurs, the quality of my assets deteriorates and some assets become NPLs. If I have NPLs, I have to impair the value of these assets and set up provisions, which are the best estimate of the expected loss I can suffer on these impaired loans. My income statement also gets more complicated, because I now have Loan Loss Provisions, and I need to make sure that at least the Net Interest Income net of Operating Expenses is able to cover the Loan Loss Provisions if I want to make profits.
If I want to measure the risk of a default and the loss I can suffer upon default, I need to estimate the following parameters: the Probability of Default (PD) of a customer loan; once it has defaulted, how much I am going to lose in percentage terms, the Loss Given Default (LGD); and my Exposure at Default (EAD). PD, LGD and EAD are at the core of the KPMG course.
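Putting these parameters together gives the expected loss on a single position, EL = PD × LGD × EAD. A tiny sketch with illustrative numbers:

```python
# Expected loss of a single loan: EL = PD x LGD x EAD (illustrative figures).
pd_ = 0.02          # 2% one-year probability of default
lgd = 0.45          # 45% loss given default
ead = 1_000_000.0   # exposure at default, in euros

expected_loss = pd_ * lgd * ead
print(f"Expected loss: {expected_loss:,.0f} EUR")   # 9,000 EUR
```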
(Slide: Credit Risk: Complications) When we talk about standard loans, in the banking industry this covers several things: there is a wide range of typologies of credit risk (this list is not exhaustive), with different shades of implicit risk. Some things are riskier, for example leveraged lending or providing protection on a securitization, while other activities, for example the securities lending business, may be less risky. What are the key levers for controlling credit risk? For sure, short-term positions are less risky than long maturities, because you can better forecast the evolution of the creditworthiness of your client. For sure, lending against collateral is better than lending unsecured. For sure, if you have collateral that is priced on a market, you can control the risk better than with collateral such as real estate, which may be more difficult to value. For sure, personal guarantees can play a role, and so does the implicit leverage of the position. In addition, concentration is another relevant dimension of risk. With this I want to say that the 3 parameters, PD, LGD and EAD, are a kind of summary of all this complexity: the LGD is a proxy of the level of protection I get from collateral; the PD is a proxy of the quality of my counterparty.
(Slide: Credit Risk Measures) This is what I was explaining so far. At the single-position level we have those 3 parameters, and those parameters can be used to calculate the Expected Loss, which is a good risk measure at the single-position level. But if I want to measure my credit risk at the portfolio level, then I need to understand the concentration of the portfolio and the default correlation between my counterparties, and I may try to derive a joint loss distribution as well as the expected loss.
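A hedged sketch of what a joint loss distribution can look like, using a one-factor Gaussian model for default correlation (a common modelling choice; the homogeneous-portfolio parameters below are illustrative assumptions, not from the lecture):

```python
# Monte Carlo of portfolio credit losses with correlated defaults
# (one-factor Gaussian model; all parameters are illustrative).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n_loans, n_sims = 100, 50_000
pd_, lgd, ead, rho = 0.02, 0.45, 1.0, 0.2     # homogeneous portfolio

threshold = norm.ppf(pd_)                     # default if latent variable < threshold
z = rng.standard_normal((n_sims, 1))          # common systematic factor
eps = rng.standard_normal((n_sims, n_loans))  # idiosyncratic factors
latent = np.sqrt(rho) * z + np.sqrt(1 - rho) * eps
losses = (latent < threshold).sum(axis=1) * lgd * ead

print("Expected portfolio loss:", losses.mean())
print("99.9% loss quantile:", np.quantile(losses, 0.999))
```

With rho = 0 the loss distribution stays tightly concentrated around the expected loss; with positive correlation the tail quantile moves much further from the mean, which is exactly why portfolio-level measurement matters.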

So far we have basically discussed the credit risk perspective, but it is not the only risk that banks face. Another standard risk is liquidity risk, and we have already explained why this risk affects the structure of any banking business: we have deposits, and we know that we can always go to the bank and ask to withdraw them, while loans tend to be longer term; but in order to finance the economy you need both. Loans in particular are not very liquid: you cannot monetize them easily or quickly. This kind of maturity transformation is really a structural feature of any banking business. In most cases, banks and more sophisticated institutions also tend to increase the leverage implicit in their business, which adds a further component: instead of relying only on deposits, they also collect money from institutional investors on the financial markets and use that money to fund their business.
What is the main safeguard against liquidity risk? In the short term, the main safeguard is given by cash and HQLA, an acronym that stands for High Quality Liquid Assets. The idea is that you need a liquidity reserve (money, or something that you can liquidate very easily, monetizable overnight) in order to absorb possible shocks on the liability side: you may have deposit outflows, losses in general, or some additional costs, all of which are sources of liquidity risk, and in order to protect yourself you need liquidity resources. For the banking business, cash and HQLA play the same role against liquidity risk that equity plays against credit risk: with equity I can cover the losses from the non-performing loans in my portfolio; with cash and HQLA I can manage a compression of my balance sheet due to liquidity outflows, that is, a sudden reduction of my liabilities that forces me to monetize a part of my assets.
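A hedged sketch of this buffer logic (similar in flavor to the Basel III Liquidity Coverage Ratio; the figures and stress assumptions below are illustrative, not regulatory values):

```python
# Does the liquidity buffer cover stressed net outflows? (Illustrative figures.)
cash = 50.0
hqla = 150.0                       # high quality liquid assets, after haircuts
stressed_deposit_outflow = 120.0   # assumed deposit run-off under stress
stressed_inflows = 20.0            # assumed inflows under the same stress

net_outflows = stressed_deposit_outflow - stressed_inflows
coverage = (cash + hqla) / net_outflows
print(f"Liquidity coverage: {coverage:.0%}")   # >= 100% means the buffer suffices
```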
One important thing to come back to about credit risk: if there is a downturn in the economy, your NPL portfolio starts to grow, and a significant fraction of your resources ends up invested in these loans; in this scheme, it is as if the equity were almost entirely financing the NPLs, whose size approaches that of your equity. This is perceived as a real issue by banks, because part of their resources is diverted from their real purpose of financing the real economy: they are funding positions that are in recovery and not productive, and above all you cannot consider those resources income-producing. So once you have crises like the one the Italian banking system went through in recent years, this is a problem: you are using your resources in a biased way, not to finance the real economy but to finance expected future recoveries on activities that have already failed. This problem was so big that the European institutions and the regulators put in place dedicated actions to give banks incentives to reduce the stock of their non-performing loans, so that they could better finance the real economy.
Liquidity risk is really important nowadays, in this Covid-19 scenario: the real economy is frozen, loan repayments are subject to moratoria, that is, delayed payments, which means a delay on the asset side while the liabilities keep running; all the companies in trouble that have committed credit lines are trying to draw on those commitments in order to get extra liquidity; and as a third force, a number of new financings are being granted with public guarantees, which will increase the asset side as well. So there are 3 driving forces bearing on the liquidity equilibrium of any institution, and that is why we are paying significant attention to this profile.
There are also different angles from which you can look at liquidity risk, with two main components: market liquidity risk and funding liquidity risk. Funding liquidity risk is a shock on your liability side: depositors withdraw their deposits, or the funding you get from your other activities is reduced. On the asset side, you may run into problems with the quality of your assets: you are not able to monetize them, or your HQLA suffer a shock. To name a typical HQLA: the Italian sovereign bond. When the spread on Italian sovereign bonds increases, the price of those instruments falls and liquidity in that market narrows, so the ability to sell those products and obtain funding is lower, and this is an additional risk.
In order to have a liquidity crisis you need both phenomena at the same time. Think of having long-term assets and short-term liabilities: if you have a funding problem, a shock for example, as long as your assets are liquid and can be monetized in the market, you can still pay back those liabilities; if you cannot monetize them, then of course you run into a problem. So there are two sides of the issue, but you need both elements to have real pressure. This is what happened during March 2020: a sudden reduction in prices on the financial markets, difficulties in monetizing assets, and at the same time increased liquidity demand due to the freezing of the economy. The ECB played a central role: it tried to offset the pressure on the market, and it was successful.
On liquidity: what we saw before is the traditional bank, but banks are actually much more sophisticated, and in general there is also a shadow banking component: vehicles are used to perform maturity transformation, and there are businesses that try to turn vehicles and commercial paper into structural funding, which is again a form of maturity transformation. In the standard traditional banking environment, deposits and credit lines are the sources of liquidity risk; in more complex institutions the possible sources of liquidity risk multiply: you may have problems in the short term, and you may have problems in the long term, for example by being unable to issue bonds. This is what happened to money market funds: they came under pressure because investors wanted liquidity, and if the instruments held are no longer liquid because the market has frozen, then the conversion is not possible.
One thing to know is that, in general, liquidity reserves are not held by banks as cash: they are made of HQLA. I hold these as my liquidity reserve and I use the repo market to monetize them in the safest possible way; and if the transaction is very safe, it is perceived as money. Cash cannot be used up in payments: for a bank it has to stay in the network, because it has to be handed over on demand to clients, otherwise you end up in a bank run. So repo is the standard way in which banks manage liquidity risk: in the short term they tend to keep their liquidity in securities; those securities can be posted as collateral in order to meet liquidity demand, while the banks that lend money in the repo market receive securities that can in turn be passed on to other clients. A liquid repo market is a good institution for banks, but during crises it can suddenly become illiquid: in particular, in 2008 you could sell assets only with a haircut. The haircut tells you what percentage of the value of the asset you can actually monetize: in 2007 it was close to 0, and then in 2008 you could get back only a fraction of your asset's value.
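The haircut arithmetic, as a tiny sketch (illustrative numbers):

```python
# Repo haircut: the fraction of the collateral's value you cannot borrow against.
market_value = 100.0
haircut = 0.15                          # 15% haircut, as in a stressed market
cash_raised = market_value * (1 - haircut)
print(cash_raised)                      # 85.0: only 85% of the value is monetized
```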

*Question: what kind of underlying assets?*


When we say HQLA, we mean really high quality liquid assets, but this is something that can change: these assets are themselves subject to risk, as with Italian sovereign bonds nowadays. During the crisis we saw that even assets that were safe from a credit risk perspective could lose liquidity, and the capacity of the repo market is really variable in terms of what is accepted as collateral in the short term. You can also say that the ECB has a role here, with its longer-term refinancing operations: it sets fixed rules for collateral eligibility, and if the collateral is ECB-eligible you can always monetize it by going to the central bank, even if the market freezes; otherwise you cannot. Repo is an overnight market: I lend today something that I need back in a very short time, not at maturity. During Covid-19 the ECB also considered commercial paper, with very strict liquidity rules about eligibility.
If the haircut increases and I am the security holder, I can extract less money: first because the market price of my asset goes down, and also because people are asking me for a greater haircut, so I get less, with prices spiraling downward.
Think also of a downgrade of Italian sovereign bonds: they are considered as money, but this is no longer guaranteed in case of crisis, and the willingness of institutions to hold them is much reduced; investing in these securities declines. The third typical risk of any bank is interest rate risk, which is very similar to liquidity risk: it is related to maturity transformation, but it is also related to the repricing of your portfolio.
So, on the asset side you have cash, mostly invested at variable (overnight) rates; HQLA, which are securities mostly at fixed rates; loans, which can be fixed or variable rate, so a mix; and NPLs, which are fixed by definition. On the liability side we have sight deposits, which are contractually overnight but whose behavior is affected by this environment, and other funding sources with different features. If interest rates rise or fall, your loan portfolio and your liability portfolio may reprice with different intensities, and this can generate volatility in the net interest income. A change in interest rates feeds through with different elasticities into the interest paid on liabilities and the interest earned on loans, so the net interest income is a stochastic variable that changes over time and with the interest rate itself, and it is very important for any bank. Also in this respect we can take different perspectives on interest rate risk: we can focus on the volatility of the net interest income, but also on the economic value.
Net interest income is the driving part for banks: operating expenses tend to be fixed (not variable) costs, while the operating income and the net interest income are a function of interest rates; if you want to stabilize your profit, you need to stabilize your net interest income with respect to the interest rate. Or you can take an economic perspective: you may think of the equity, the economic capital you have, as the difference between the market value of the assets and the market value of the liabilities. Rates are very important for market value: they are a key element in discounting future cash flows, so if the change in the value of the assets is not in line with the change in the value of the liabilities, then when interest rates change you have a change in equity, and this is an additional way of looking at this risk. Both perspectives are important: a banker, in a going concern scenario, needs to stabilize profits, while a supervisor needs to look at the residual value of assets minus liabilities; indeed, bankers focus more on the volatility of the net interest income, while I focus more on the volatility of the economic value.
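A hedged sketch of the earnings perspective, using the classic repricing-gap approximation ΔNII ≈ gap × Δr (the balance-sheet figures are illustrative assumptions):

```python
# Repricing-gap view of interest rate risk: for a parallel rate shift,
# delta_NII ~ (rate-sensitive assets - rate-sensitive liabilities) * delta_r.
rate_sensitive_assets = 600.0        # items repricing within the horizon
rate_sensitive_liabilities = 800.0   # deposits/funding repricing within the horizon

gap = rate_sensitive_assets - rate_sensitive_liabilities   # -200: liability-sensitive
delta_r = 0.01                                             # +100 bps parallel shift
delta_nii = gap * delta_r
print(delta_nii)   # -2.0: NII falls when rates rise, for this balance sheet
```

The economic value perspective would instead revalue (discount) all future cash flows of assets and liabilities at the shifted rates and compare the change in the two present values.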

Capital and earnings are both very important. (Chart: NII/Avg Assets Evolution) The chart shows the projection of the net interest income that banks can earn over a certain period: in the initial part there is a sharp reduction in interest rates, and then for all the banks a reduction in the earnings associated with their assets; but there is also a lot of dispersion, with some banks more affected and some less. There may be several elements behind this, such as different strategies regarding interest rate risk; these are a key element for survival in such a scenario, and for profitability.
Another important chart is the one on 'Sight and Savings Deposits'. Sight deposits are really the franchise of any bank: they are a huge part of any bank's liabilities, and their value is the difference between the market rate and the client rate. Before 2008 that distance was very big: you were able to borrow at around 1% and lend at around 5%, so there was a lot of margin. What happened?
After 2008 there was a sharp decline in rates, so the margin sharply declined, compressed by the market crisis, and this was not related to credit risk. This compression means that the franchise implicit in the banking industry has lost value, and it is probably the first source of difficulty for commercial banks nowadays: they are no longer really able to monetize this franchise. It is also the reason why, before 2008, others wanted to buy banks here in Italy. So, negative rates are one of the main issues for banks, and a prolonged period of low interest rates as well, maybe even harder to withstand.
Operational risk:
it is the risk of losses that may result from internal mistakes, inadequate operations, and errors in general that affect your balance sheet. Cyber and IT risk are very relevant problems: there is a huge process of digitalization underway, new risks are emerging from this point of view, and investing in this area is very important. The environment is another theme: today people are much more focused on the green economy and on reducing pollution. During Covid-19 the main discussion is all about operational risks; let me give a few examples. Banca d'Italia is working about 90% remotely: you have to do most of the work from home, and an important fraction of employees had never been tested in such a setting, so in a remote setup you need to be extra careful, because banks cannot afford mistakes, and they have to change the way they interact with clients, since you cannot meet them in person. We are in a context that is really far from the routine, so you need to pay more attention.
In trading activities you need to mark to market, and a lot of controls do not work very easily in this setting. Many operational risks are also related to data: if you lose data, you can also have problems with clients, and so on.
We have discussed the standard risks of standard commercial banks, but not all banks are like this: universal banks provide a wider range of services, and so banks all differ in this respect. Banks are market makers; they give access to the primary market: if you want to go public, you deal with banks that handle that, and there are also banks that deal with the private side. There is also a wider range of risks, in particular more risks for banks that are more concentrated and larger.

Universal banks:
consider the standard balance sheet of a universal bank and its standard operating income: you have trading activities, and committed and uncommitted lines. The operating income is affected by every source of risk, and the operating profit is the item most affected by risks. We ask for minimum equity, but what does that mean?
We ask for a minimum amount of capital that any institution has to hold. You should remember one number: 12.5. Banking is a kind of leveraged business: with a small fraction of equity, you take money by borrowing and then invest it. The maximum ratio between assets and equity is 12.5: for each euro invested as equity I can have 12.5 euros of assets; putting it in other terms, for each euro of assets I must have 8% of equity. So 8% (= 1/12.5) is the key element of the minimum capital requirement for a bank. Of course, it is not calculated on the nominal amount of the assets: for example, a residential mortgage and an exposure to a hedge fund get different weights depending on the typology considered. This is basically the standard rule for capital requirements: I always ask that capital, compared to the risks, stays above the minimum set by regulation, which is about 8%. I do not consider all risks but just a subset of them: credit risk, market risk, operational risk and concentration risk, the risks that can generate problems; the capital requirement they generate must be lower than the capital the bank holds.
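The 8% = 1/12.5 rule in numbers, as a small sketch (the risk weights below are illustrative assumptions; actual weights depend on the regulation and the exposure class):

```python
# Minimum capital = 8% of risk-weighted assets (illustrative risk weights).
exposures = {
    "residential_mortgage": (200.0, 0.35),   # (amount, assumed risk weight)
    "corporate_loan":       (300.0, 1.00),
    "hedge_fund_exposure":  (50.0,  1.50),
}

rwa = sum(amount * weight for amount, weight in exposures.values())
minimum_capital = 0.08 * rwa                 # equivalently: rwa / 12.5
print(rwa, minimum_capital)                  # 445.0 35.6
```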
Basically it is the same story: we have the distinction between the trading book (items held with trading intent) and the banking book; the trading book is subject to market risk requirements, while the banking book is subject to credit risk requirements. The purpose is to have enough equity capital, sufficiently beyond the minimum, to offset the risks that arise from these elements.
Another important element is: how are the risk weights calculated? They can be calculated in two different ways: with a standardized method, or by using internal models that leave to the bank the task of estimating the key inputs; this applies to credit, market and operational risks. Smaller institutions typically use the standardized method, while the more sophisticated ones focus on internal models. The minimum capital requirement is the sum of 3 VaRs: VaR is Value at Risk, and it tries to measure the maximum loss that a bank can suffer over a certain horizon at a given confidence level.
So the standard idea is that the minimum capital, the amount of equity, must be equal to the maximum loss that I can suffer from all my activities: it has to be sufficient to cover all possible losses with 99.9% confidence, i.e. only a 0.1% chance of default over a one-year horizon. This measure is itself an estimate, and a rather rough one, because it relies on the assumptions defined before: I assume that I suffer the maximum loss from credit risk, market risk and operational risk all together, simultaneously. The confidence level is very demanding, but on the other side there are things that the model does not capture, or there may be mistakes in implementing it, so in practice defaults tend to be more frequent. The definition of Value at Risk is that the capital has to be sufficient to cover any loss I expect to suffer within that confidence level; this is the main regulatory element that is set.
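A hedged sketch of VaR as a quantile of a loss distribution (the lognormal loss model here is purely an illustrative assumption):

```python
# Value at Risk as the 99.9% quantile of a simulated annual loss distribution.
import numpy as np

rng = np.random.default_rng(1)
losses = rng.lognormal(mean=0.0, sigma=1.0, size=100_000)  # simulated losses

var_999 = np.quantile(losses, 0.999)   # exceeded only ~0.1% of the time
print(var_999)
```

Note that the quantile says nothing about how bad losses are beyond it, which is exactly the tail risk criticism discussed below.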
So basically, we just said that capital requirements rely on VaR, and of course VaR attracts criticism in 4 areas: tail risk, procyclicality, complexity, and arbitrageability.

1. Tail risk: if you think about it, standard VaR is a quantile estimate, and a quantile tells you nothing about the shape of the distribution beyond it; the tail can behave well or badly, and the loss, given that things go wrong, can be much bigger than the one I have estimated in my capital. We already know that credit losses are not normal, so the tails are fat. Massive concentration can change the picture: a seemingly safer instrument can have a lower probability of loss but a massive loss in the tail. In market risk it is the bank that defines the distribution, and so it is the bank that effectively decides the tails.

2. Procyclicality: the problem with risk-based capital requirements is that they are a source of procyclicality. In general, expansions of banks' assets have been accompanied by increases in total liabilities, which means that in a boom banks tend to run higher leverage, with capital growing less than proportionally; and since required capital moves in line with measured risk, when things go wrong the situation is much more problematic. After several years of good credit quality and a growing path, with assets and liabilities piling up, the crisis, when it comes, becomes almost impossible to withstand.

3. Complexity: basically, if you think about a balance sheet, minimum capital requirements are a kind of risk weighting of the assets as a function of their implicit risk. A more basic measure is leverage: assets as a multiple of equity. By reducing the measured implicit risk, you can push your leverage even higher, but not the other way around. In Haldane's paper, the risk-based capital ratio was not a predictor of default, and even banks with quite high capital ratios failed. This is a fairly standard finding; the reason we nevertheless use risk-based capital ratios is to avoid giving banks an incentive to invest in riskier positions.

There are also a number of advantages: the very fact that we discuss with banks the main ingredients of credit risk gives us an additional perspective on the bank in general and a better understanding of the portfolio, and it allows better-informed decisions.
Calculating Capital Ratios
As regards capital, we may have different typologies of capital, defined as a function of the loss-absorbing capacity of the instruments; hence there is a definition of the common equity of a bank and, in general, there are additional instruments issued by banks, such as hybrid bonds and subordinated debt. There are different typologies of ratios: we calculate the Common Equity Tier 1 ratio (common equity over total RWA), but we also have the Tier 1 ratio and the Total Capital ratio. There are a number of items that must be deducted from the common equity calculation, and the computation is very technical.
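The three ratios on illustrative figures (a minimal sketch; deductions and instrument eligibility are far more technical in practice):

```python
# Regulatory capital ratios, computed on illustrative balance-sheet figures.
cet1 = 40.0            # common equity tier 1, after deductions
additional_tier1 = 5.0
tier2 = 10.0
rwa = 500.0            # total risk-weighted assets

cet1_ratio = cet1 / rwa                                        # 8.0%
tier1_ratio = (cet1 + additional_tier1) / rwa                  # 9.0%
total_capital_ratio = (cet1 + additional_tier1 + tier2) / rwa  # 11.0%
print(f"{cet1_ratio:.1%} {tier1_ratio:.1%} {total_capital_ratio:.1%}")
```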
The 3 main blocks are credit, market and operational risk.
Credit risk applies to banking book items, such as loans and receivables and guarantees; they are subject to capital requirements and there are 2 basic methodologies: standardized and internal models. With internal models, the banks themselves estimate PD, LGD and EAD, and the outcome feeds the risk weight calculation. With the standardized approach, the accounting values are multiplied by regulatory credit risk weights that differ across exposure classes, so the balance sheet exposure changes because it is multiplied by these risk weights. For defaulted exposures, if provisions are below 20% of the exposure, the risk weight goes up to 150%. With the internal ratings based approach, the regulator provides a formula that must be applied; in any case, both the standardized approach and the IRB formula try to define the maximum loss you can suffer with 99.9% probability. Comparing them graphically, the IRB is more risk sensitive while the standardized approach is flatter, so there is an implicit premium: with good-quality exposures you tend to get a better treatment under IRB, while with very poor quality you get a higher capital requirement under IRB, whereas the standardized approach is still flat.
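A hedged sketch of the IRB idea, implementing the Basel II risk-weight formula for corporate exposures in a simplified form (without the maturity adjustment; the inputs are illustrative):

```python
# Simplified Basel II IRB capital requirement for a corporate exposure,
# omitting the maturity adjustment.
import math
from scipy.stats import norm

def irb_capital_requirement(pd_, lgd):
    # Supervisory asset correlation, decreasing in PD (Basel II corporate rule).
    w = (1 - math.exp(-50 * pd_)) / (1 - math.exp(-50))
    r = 0.12 * w + 0.24 * (1 - w)
    # PD conditional on the 99.9% systemic stress, minus the expected part.
    stressed_pd = norm.cdf((norm.ppf(pd_) + math.sqrt(r) * norm.ppf(0.999))
                           / math.sqrt(1 - r))
    return lgd * stressed_pd - lgd * pd_      # capital per euro of EAD

k = irb_capital_requirement(pd_=0.02, lgd=0.45)
print(k, k * 12.5)   # capital requirement and the implied risk weight (~96%)
```

This is what makes the IRB curve risk sensitive: as the PD input grows, the capital requirement grows continuously, whereas a standardized weight stays flat within each class.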
Lecture of 14/05
Risk is related to variability in the value of financial instruments. Every time you talk about risk (in particular about credit risk) you deal with two different concepts: 1) the expected component of risk; 2) the unexpected component of risk. What do financial institutions do when they manage risk? They try to absorb, mitigate, diversify, and transfer risk. The idea is that risk generates performance volatility that has a negative impact on the value of a financial institution. For this reason, we try to measure risk in order to monitor and mitigate the risk itself.
We just said that we have two different components of risk. The idea is that we have some fluctuations in market value (e.g. a share price moving above and below a certain threshold) that we can split into an expected component and an unexpected component, and we measure the two components in two different ways. The same reasoning applies to the number of loan defaults and credit rating downgrades, as well as to the number of processing errors: both categories can be divided into an expected component and an unexpected one, which can be computed with different methods.
When we go through the computation of the economic capital, we can divide the loss distribution into 3 parts: 1) the expected loss: the bank knows that it is going to lose some amount because some borrowers will default and some credit obligations will not be met, and for this reason there is a standard risk cost to cover such losses; 2) the unexpected loss component: losses that the bank does not expect, against which it computes the economic capital, taking into account the losses in this part of the distribution; 3) finally, the catastrophic losses, for which there is no capital coverage in the bank's capital estimation. The expected loss is important for pricing, because the price needs to cover the expected loss plus a risk premium on your capital. When we refer to pricing, we mean that when a borrower pays interest on a mortgage, that interest is based on an estimate of the riskiness of the mortgage, and part of the interest goes to cover the expected component of the risk.
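A hedged sketch of this pricing logic (all inputs are illustrative assumptions; real pricing also includes operating costs and competitive factors):

```python
# Risk-based loan pricing: the client rate should cover funding cost,
# expected loss, and a risk premium on the capital absorbed.
funding_cost = 0.010                   # cost of the bank's own funding
pd_, lgd = 0.02, 0.45                  # borrower risk parameters
expected_loss_rate = pd_ * lgd         # 0.9% of exposure per year
capital_ratio = 0.08                   # capital absorbed per euro lent (simplified)
cost_of_equity = 0.10                  # required return on that capital

risk_premium = capital_ratio * cost_of_equity      # 0.8%
client_rate = funding_cost + expected_loss_rate + risk_premium
print(f"{client_rate:.2%}")                        # ~2.70%
```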
The Basel Committee was established at the end of 1974 because of a shock in the banking market and in the currency market, which led to the failure of a number of banks in West Germany. The Committee aims to enhance the stability of the banking system by issuing regulatory requirements, reflected in guidelines and standards that banks must apply in order to evaluate risk. Before the Basel accords, banks were independent in their risk management; afterwards, they switched from a structural approach to a prudential approach with Basel 1, which introduced capital requirements and a standardized approach. With Basel 2 we also have the inclusion of internal models, so that banks can compute regulatory capital using internal models.
1988: first Basel accord, prompted by the Latin American debt crisis; introduction of the solvency ratio; internal models are not allowed.
2004: with Basel 2, the Committee decided to replace the first accord, introducing the three pillars, internal ratings for the calculation of capital requirements for credit risk (IRBA), and the recognition of new techniques in the market.
2010: Basel 3, the response to the financial crisis of 2007-2009, during which banks had too much leverage; the Committee decided to raise capital and liquidity requirements. The Basel 3 reforms were finalized in 2017 (so-called Basel 4), where the Committee decided to reduce the variability in the calculation of RWAs.
Regulators ask banks to set capital resources according to their own degree of riskiness. There are two main reasons for such constraints: 1) in case the bank faces higher losses, it can rely on these provisioned resources; 2) banks may not find it convenient to take on too much risk, since they would have to set aside more capital.
Capital requirements represent the first pillar of Basel 2 that banks should satisfy. We are referring to several risk sources: credit risk, market risk, operational risk and counterparty risk. These requirements have pushed banks to improve their capability to measure risk.
IRB Advanced: internal rating based advanced framework.
Regarding the second pillar, it is about the supervisory review, and we have two main points of attention: 1) the bank's internal capital assessment; 2) efficient regulatory supervision. To comply with the second pillar, banks are required to undertake an internal capital adequacy assessment process, which consists in designing and implementing a risk adjustment framework. This framework must ensure that the bank constantly meets its capital requirements and manages the risks beyond those already captured in Pillar 1. This process, the ICAAP, needs to be approved by the board of the bank before being submitted to the regulator for review. Pillar 3 is about market discipline, and it lists a set of disclosure requirements which allow market participants to rely on more information about the capital adequacy of institutions.
We should not think of the pillars as stand-alone elements; we have to think of them as three integrated components that work together.
Credit risk: the possibility of losses associated with a decline in the credit quality of borrowers. There are three methods to compute it: standard, FIRB, AIRB.
Market risk: it is related to fluctuations in the prices of financial instruments.
Counterparty risk: it belongs to credit risk, since it is the risk that the counterparty will not live up to its contractual obligations.
Operational risk: the risk of losses resulting from inadequate or failed internal processes, people and systems. We can think of operational risk as human error.
We have two different approaches: the standardized approach and the internal ratings-based approach (which can be counted as two, since there is a foundation and an advanced version). The idea is that a bank can use the standardized approach in order to estimate its credit risk, even if it gives a poorer understanding of the risks and of the riskiness of the assets themselves. The IRB approach has a more complex structure, but it gives an overall understanding of risk, since a bank can identify the risk associated with each position in a portfolio.
With the standardized approach we just divide the whole credit portfolio into different classes, using a given segmentation, and we apply different weighting coefficients to each of the asset classes. With those coefficients we reflect the riskiness of the counterparty and we can assess the credit quality of each subdivision of our portfolio.
With the internal rating-based approach, we use our own model to estimate the risk parameters: PD, LGD, EAD and M. With Foundation IRB we estimate only the PD, while with Advanced IRB we estimate all the parameters.
The capital requirement k for any given loan depends only on the risk of that specific loan: it does not depend on the portfolio to which the loan is added (the so-called portfolio invariance).
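For reference, this is the shape of the Pillar 1 IRB risk-weight function behind that statement. This is a sketch: the supervisory asset correlation rho and the maturity adjustment MA have their own formulas, which differ by exposure class, so treat the details as indicative rather than as the full regulatory text:

K = LGD \cdot \left[ N\!\left( \frac{N^{-1}(PD) + \sqrt{\rho}\, N^{-1}(0.999)}{\sqrt{1-\rho}} \right) - PD \right] \cdot MA(PD, M)

Here N is the standard normal distribution function and N^{-1} its inverse. Note that K depends only on the loan's own PD, LGD and M, which is exactly the portfolio invariance mentioned above.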
Default risk is the risk that a counterparty will not be able to fulfil contractually agreed financial obligations due to its default.
Migration risk is the risk reflecting the potential loss due to changes in the fair value of credit exposures as a result of borrowers' rating transitions. Think of a borrower that at time 1 is rated BBB and in the next period is downgraded or even goes into default: this is migration risk.
Country risk is the risk of default on any foreign debt repayment of principal and/or interest owing to developments within a country that affect its creditworthiness. We can think of country risk in terms of the spread between two countries.
Concentration risk in credit portfolios is the risk of suffering extreme losses from an uneven distribution of exposures to counterparties, from contagion effects between borrowers, or from sectoral concentration (industry, geographical region, etc.). Think of a bank deciding which counterparties to lend to: the best thing the bank can do is try to diversify its loans.
Residual risk in credit risk mitigation is the risk of the bank's failure to realize the financial worth of the transactions intended to mitigate credit risk. It is the typical risk a bank faces when liquidating a defaulted position: for example, when it has to sell the house granted as collateral for a mortgage and does not receive the price it expected when the loan was originated.
There are different kinds of segmentation. Each credit risk segment can be used as a basis for the model estimation: LGD, for example, can be estimated separately for the large corporate portfolio, the foreign corporate portfolio and so on. So, to understand what we are doing with our models, we need to combine this segmentation with the four parameters we were talking about (PD, LGD, EAD and M): the segments identify the underlying population for our estimation.
The credit risk segmentation column is the one we refer to when we apply our credit risk models.
Then there is the Basel segmentation, which is the one the regulation uses to refer to the different portfolios of the bank.
Then we have the commercial segmentation, which is the one used for the pricing of the assets and for the credit policy the bank applies to the different portfolios.
We work on the first segmentation.
The definition of default is a key concept of Basel 2, and in recent years regulators have also set up new rules for it. The EBA defines default as a situation in which an obligor does not pay back a loan or a mortgage. A position is classified as default when the borrower has not been meeting its credit obligations for more than 90 days. For the past-due criterion, two materiality thresholds have to be breached: 1) an absolute threshold: if the unpaid amount is higher than €100 for a retail exposure or €500 for a non-retail exposure, we have a first signal of default; 2) a relative threshold, which is a ratio between the unpaid amount and the borrower's entire exposure. If both criteria are breached, the position is classified as default.
A position can be classified as default even without breaching these two criteria. In fact, there are also subjective criteria (unlikely-to-pay criteria), for example in case a borrower decides to restructure a debt or a loan. The bank can also apply a materiality threshold here: for example, if the NPV of the loan decreases by more than 1% as a result of the restructuring, the bank can classify the borrower as unlikely to pay.
Of course the regulator also describes the rules for the infection logic: if the borrower is part of a joint obligation (in Italian, "cointestatario"), the default propagates between the joint position and the individual ones. So not only the joint obligation is in default, but also the single obligor.
A borrower in default can also be cured, coming back to a performing status, but such a borrower stays under observation for three months. If in these three months the borrower does not breach the two materiality thresholds again, he can go back to performing status.
Treatment of multiple defaults: a borrower in default can go back to performing status, but can also default again. If the second default event happens more than 9 months after the end of the first default event, the two events are treated as independent defaults; otherwise, they are considered as one single default.
Banking regulation

Lesson 15/05

The definition of default is a combination of two things:

1. Some quantitative indications (how many days you are past due, and for which amount you are past due)
2. Some qualitative indications (whether there is evidence that your situation has deteriorated such that you are not able to repay in the future).

These three elements (the counting of days past due, the quantitative criteria and the qualitative criteria) are the core of the definition of default.

From the beginning, Italian banks classified default in a different way with respect to the one we are looking at right now, which is a European definition. We used to model the probability of default in a slightly different way because the definition of default itself was different. Now the European authorities have identified a unique way to detect default, so that all the events are modelled in the same way across Europe. All the banks at this moment are trying to become compliant with the new definition of default: it is not easy to switch from the previous definition to the new one, because there is an IT effort involved, and banks sometimes do not have all the data needed to reconstruct the past events. It is a crucial topic at the moment; all banks should apply the new definition of default from January 2021.

So, coming back to the slide (30), we stress again that there is a difference between the performing status and all the other statuses, which are default statuses. For the past due there is a counting of the number of days, but a position can also be classified in default for other reasons, such as the subjective criteria, for example when the credit manager of the bank has evidence of unlikeliness to pay: you can classify a position directly under the unlikely-to-pay criteria without passing through the past due. This differentiation is important for the modelling of the loss given default. We have three parameters to calculate the RWA: the PD parameter (the probability of default), the LGD parameter (the loss given default) and the EAD parameter (the exposure at default). For the LGD it is better to recognize all three default statuses, because past due and unlikely to pay are two statuses from which the counterparty can be cured and go back to performing. Litigation cannot be cured: it is the last part of the process, in which the bank starts, for example for a mortgage, the legal process to sell the apartment, i.e. the collateral of the mortgage.

To summarize:

Performing status: we are not in default.

Past due / unlikely to pay / litigation: we are in default.

As we move from past due to unlikely to pay, the situation gets worse and worse; once we are in the litigation status, the loan cannot be cured and we cannot go back to performing. This is the crucial element for modelling the loss given default. Why? When discussing the probability of default model in the exercise, we do not treat these three cases (past due, unlikely to pay, litigation) as three different cases; we do not care about the different default statuses, because we are just trying to model when a position moves from performing to default.

For the PD model, the obligors are simply defaulted or not. Litigation, for example, is when a company goes bankrupt: a status in which the company has no possibility to pay back the loan to the bank, nor to survive, since it is the last part of the company's life.

We must consider both the relative and the absolute threshold, and they must be breached simultaneously. Just looking at the past due, we can make an example.
Let us suppose a bank lends 200 euros to a counterparty and the unpaid part of the loan is just 2 euros. If the counterparty has not been paying for more than 90 days, would you say it is in default? No, because we are not above the absolute threshold.
How large should the unpaid amount be for the position to be set as default?
There are two elements: the absolute threshold and the relative threshold. Say the absolute threshold is 100 euros and the relative threshold is 5 percent. If the loan is 1000 euros, how much should the unpaid part be for the position to be set as default? Remember that both criteria must be met to consider the obligor in default. The first question is: did I breach the absolute threshold? At least 100 euros must be unpaid. The second: is the unpaid amount at least 5% of the total loan? Since 5% of 1000 euros is 50 euros, if the unpaid amount is greater than 100 euros we are sure that the relative threshold is also met in this case.
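A minimal sketch of this dual-threshold check in Python, using the illustrative values from the example above (100 euros and 5 percent), not the regulatory ones:

def is_past_due_default(unpaid, exposure, days_past_due,
                        abs_threshold=100.0, rel_threshold=0.05):
    # Past-due default test: both materiality thresholds breached for > 90 days.
    breaches_absolute = unpaid > abs_threshold
    breaches_relative = unpaid / exposure > rel_threshold
    return days_past_due > 90 and breaches_absolute and breaches_relative

# The first example: 2 euros unpaid on a 200-euro loan is not a default...
print(is_past_due_default(unpaid=2, exposure=200, days_past_due=120))     # False
# ...while 120 euros unpaid on a 1000-euro loan breaches both thresholds.
print(is_past_due_default(unpaid=120, exposure=1000, days_past_due=120))  # True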
The definition had to be harmonized among all institutions, because it used to vary a lot: it was not unique across banks, so different banks were modelling different events. All banks should model and apply the same definition. What we should really understand is that supervisors use a very strict identification of default; the problem is that you classify as default positions that are still alive, since positions that are past due or unlikely to pay are in most cases still doing business, and this makes the measurement of the loss given default quite difficult. This is an additional complexity you must face when estimating these parameters and models.

Slide (32):
Let us now have a look at the theoretical concept of PD model estimation, which can be summarized in three steps.

First, there is the module development: the probability of default can be explained by risk drivers, or criteria, that represent the specific features of the positions belonging to the portfolio for which you want to estimate the PD. Yesterday we said we can have different portfolios, for example a large corporate portfolio, a business portfolio, a retail portfolio, and each of these portfolios can have different modules, because not all features are equal across the obligors.
The second part of the module development is the development phase, in which you estimate the different models and produce the output of the model for each module.
We want to understand whether a performing counterparty, which at the moment is paying its loan, is going to default in the future. This is what we are trying to do. When we say that different modules have to be assessed, it is because there are different features: we cannot say that the counterparties belonging to all portfolios share the same characteristics. There are different characteristics that we can detect using risk drivers. Let us split the problem into parts. Imagine we are working with a corporate portfolio: you are giving loans to corporates. In this case it is important to analyse the financial module and assess which are the financial drivers of those corporates. I can look at the total assets, or at the socio-demographic features of the counterparty, such as the geographic position or the type of counterparty. There are a lot of elements to look at. There are also risk drivers that I can group in the module called internal behavioural: how is this counterparty behaving with respect to the bank? Is this counterparty paying all its loans? If not, how much is it not paying? So, we want to identify, from different perspectives, the behaviours that the counterparties show with respect to the bank, and we split these elements into different modules. Each module is treated as a single model, and we estimate a logistic regression for each module to identify its final score (the output of the logistic regression) related to each of the topics. Before the logistic regression you can also study the variables: you have different modules for different features. If, for example, you have a corporate portfolio, it is important to study the financial statements of the corporates. You can study the individual variables, or combine them, and you can also do it from a statistical point of view: studying the trend of each variable and the correlations among the variables in your list. You start with a long list of all the variables which can affect the risk of your portfolio, and then step by step you select the best variables with some statistical criteria. What could be a statistical criterion to select the variables? Given that you must choose among many variables, what can you do to discriminate among them? Imagine we have many indicators, all of them good: I have, say, 10 drivers. How can I measure the performance of those variables to say which is the best indicator? The statistical significance, such as the t-statistic: this is the univariate analysis. You run each variable against the same dependent variable (the default flag you also use in the development phase of the logistic regression) and you look at the p-value, the significance of your beta; if it is not significant, you can discard that variable.
We take all the variables in the initial list, say 20 variables, and we test each of them by running a single regression per indicator. The dependent variable is coded (0,1) as default/non-default, and the regressor is one of the 20 variables in the initial list. I run 20 regressions and I look at the results: first the statistical significance (the p-value), and second the performance of that model, which is a univariate model because it has just one independent variable against the target variable. So the first step is to run these univariate regressions, to see which variables have no explanatory power for your dependent variable, and the second step is to see how the variables interact among themselves. Normally, the more variables a model has, the better it fits. This does not work with a PD model: the number of variables cannot be too high, and you need to pick the ones that mean something economically speaking, not just the most powerful variables statistically speaking. A variable should offer a good trade-off between statistical significance and economic sense. When you have many variables you perform some steps automatically, and you may end up with variables without any economic sense, so at a certain point you look at them one by one to understand them. (A sketch of this univariate screening follows below.)
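A minimal sketch of that univariate screening step, assuming a pandas DataFrame df with a 0/1 column default_flag and one numeric column per candidate driver (all names here are illustrative):

import pandas as pd
import statsmodels.api as sm

def univariate_screen(df, target, candidates, alpha=0.05):
    # One logistic regression per candidate driver; keep beta and p-value.
    rows = []
    for var in candidates:
        X = sm.add_constant(df[[var]])            # intercept + single driver
        fit = sm.Logit(df[target], X).fit(disp=0)
        rows.append({"driver": var,
                     "beta": fit.params[var],
                     "p_value": fit.pvalues[var],
                     "keep": fit.pvalues[var] < alpha})
    return pd.DataFrame(rows).sort_values("p_value")

# e.g. univariate_screen(df, "default_flag",
#                        ["debt_over_turnover", "ni_over_assets", "days_overdue"])

Drivers that survive the p-value screen are then checked for correlation among themselves and, above all, for economic sense before entering the multivariate model.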
The dependent variable is a flag (1, 0). In the sample you just have some snapshots, for example at each year end, and you look at your portfolio at those dates; but the flag does not reflect the status of the obligor at a given date, it reflects whether the client will default over a period of time.
To refresh the concept:
I take a reference date, I place myself, for example, at December of a certain year and I look at the next year; I have an observation window of one year. At the reference date all the counterparties are performing, they are paying their loans. Then I look one year later and ask how many of those counterparties have defaulted, flagging them accordingly: the flag built from the variables at the reference date and the status one year later is my target variable, and I try to estimate the probability of migrating from performing to default within the next year. We call each window a snapshot; say we want to model the PD over a 5-year time horizon. I take the counterparties in, say, 2012, when they are performing, and I ask whether they default within the next year: the counterparties that default are flagged with 1 in 2013, and I do the same with the following years. After the 5 years I have my development sample, the sample I use for the estimation, with the default flag for each of the 5 snapshots. I do all of this for all the counterparties in the portfolio; then I want to predict this flag by looking at the different risk drivers. We have the target variable and we want to estimate it.
Making an easy example: if all 10 counterparties are performing in the first year, all flags are 0; the year after all flags are still 0 except for 3 of them, so the default rate is 3 out of 10, i.e. 30%. Now I want to estimate this number: I look at the characteristics of all the counterparties and I try to use them to predict the default. Suppose I notice that the 3 that went into default always had a risk driver x (which can take value a or b) equal to a when the default flag was 1, and vice versa: in this case that variable is perfect, since it fully predicts the probability of default.
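A minimal sketch of this snapshot construction, assuming a long-format pandas DataFrame status with columns borrower_id, year and in_default (a boolean status observed at each year end); all names are illustrative:

import pandas as pd

def build_default_flags(status, first_year, horizon):
    # For each yearly snapshot, keep performing borrowers and flag
    # those that are observed in default at the next year end.
    samples = []
    for year in range(first_year, first_year + horizon):
        now = status[(status["year"] == year) & (~status["in_default"])]
        nxt = status[status["year"] == year + 1][["borrower_id", "in_default"]]
        snap = now.merge(nxt, on="borrower_id", suffixes=("", "_next"))
        snap["default_flag"] = snap["in_default_next"].astype(int)
        snap["snapshot"] = year
        samples.append(snap[["borrower_id", "snapshot", "default_flag"]])
    return pd.concat(samples, ignore_index=True)

# e.g. five snapshots starting from 2012, as in the lecture example:
# dev_sample = build_default_flags(status, first_year=2012, horizon=5)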

Let us try to see a real-world example: take the excel file.

Let us say this is our reference date, at which I place myself to identify the default flag one year later. At that date all the counterparties are performing, with default flag equal to 0. Then I place myself one year later and I look at the resulting default flag, which can be either 0 or 1.
Now we want to look at the indicators, the risk drivers. Imagine we have two different families of drivers: the financial module and the behavioural module. In the financial module we take some variables, such as the ratio of bank debt over turnover and net income over total assets. Every counterparty in my sample has a value in the corresponding column for each variable, meaning that I can look at the two ratios for all the counterparties in the sample. To understand whether the chosen drivers are good, I run a logistic regression in which I put the default flag as the target variable and the ratios as independent variables. I can then look at the estimated parameters of the logistic regression, analyse the coefficients and see whether they are significant: if they are, the ratio is a good driver to estimate my probability of default. (A sketch of this module estimation follows below.)
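A minimal sketch of the financial-module fit on such a worksheet, assuming columns named debt_over_turnover, ni_over_assets and default_flag (hypothetical names for the two ratios described above, read from a hypothetical file):

import pandas as pd
import statsmodels.api as sm

df = pd.read_excel("pd_example.xlsx")   # hypothetical file standing in for the Excel sheet

X = sm.add_constant(df[["debt_over_turnover", "ni_over_assets"]])
financial_module = sm.Logit(df["default_flag"], X).fit(disp=0)

print(financial_module.summary())                      # sign and p-value of each beta
df["financial_score"] = financial_module.fittedvalues  # linear predictor: the module "score"
df["financial_pd"] = financial_module.predict(X)       # logistic transform of the score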

A brief focus on the variable selection and the univariate analysis we have already talked about: we find these two steps in all models, also in LGD models and EAD models. The variable selection step, in which you perform the univariate analysis, is common across all of them, whereas the construction of the development sample differs from model to model; in other words, some of the steps through which you find the predictors are shared across the models.

What do you expect as the relation with the probability of default if you are talking about the earnings of a company as a regressor? The lower the earnings, the higher the probability of default. Having that type of data sample also suggests analysing other factors: if one year you have 100 of earnings and the following year 80, earnings are decreasing; but if, with respect to those earnings, the company is sustaining lower costs, it is possible that the second year is actually better than the first. The same argument holds if you deal with the ratio of earnings over total assets.
What do you expect if we are talking about the volatility of the corporate's cash flows (measured as how much they change across the financial statements of the years)? Increasing volatility is a bad signal with respect to the probability of default, because there is a greater risk of a sharp drop in the cash flow next year even if the current year's cash flow is healthy. The higher the volatility, the lower the stability of the cash flow: you can link stability with a less risky corporate and higher volatility with a riskier corporate.

For the behavioural module we select two different variables. The first is the number of days in overdue: we know these clients are in performing status, but at the same time a number of days in overdue may already have been counted. If this number is higher, is the client riskier or not? It is riskier: the higher the number of days in overdue, the higher the probability of flagging the client as default. It is a really good indicator. The behavioural module captures the behaviour of the clients across their accounts, mortgages and all the credit lines within the bank. The second driver is the utilization rate: imagine I have a credit card with a 1000-euro limit, but I use just a part of that amount each month. If a client uses more than what the bank granted, what will be the relationship between the utilization rate and the default rate? The higher the utilization rate, the higher the probability of default. That is the reason we need variables with economic sense. The output of these two modules is what you see in the file: the score. The score is the output of the logit model, and it is transformed into a probability of default; the formula is in the blue box.
We do this with the two variables for each module, obtaining the final scores, which are the outputs of the logistic regressions. What we have at the end are two module scores. Then you can go through the second phase, in which you take the outputs of all the modules and you combine them in order to have a unique score for each counterparty. To combine them you use again a logit regression and you put the modules together, but the regressors are just the scores output by the module-level logit regressions, instead of the single risk drivers. In this way what you get is the integration score, which sums up all the modules, i.e. all the characteristics of the features you used to model the probability of default. The output is the integration score, as already said, and then you can use the formula in the blue box to obtain the integrated PD.
Repeating it:
Imagine you got the final scores (the outputs of the logit regressions) from the two different modules: we ran a logit regression on the financial module and another one on the behavioural module, and we got those two final scores. Now we want to combine them into one model. We run a third regression in which the regressors are not the original risk drivers, but the final scores themselves, obtaining a final value called the integrated score. Then you transform this score into a PD value, finding the integrated PD. In the formula, the argument of the exponential is what we call the score.
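In symbols, a sketch consistent with the description above (the blue box in the slides contains this same logistic transformation):

\mathrm{score}_{\mathrm{int}} = \beta_0 + \beta_1\, s_{\mathrm{fin}} + \beta_2\, s_{\mathrm{beh}},
\qquad
PD_{\mathrm{int}} = \frac{e^{\mathrm{score}_{\mathrm{int}}}}{1+e^{\mathrm{score}_{\mathrm{int}}}} = \frac{1}{1+e^{-\mathrm{score}_{\mathrm{int}}}}

where s_fin and s_beh are the financial and behavioural module scores.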
What is missing is the calibration phase, in which we want the PDs of the customers to reflect the long-run average PD. We can do it in two steps: in the first we compute what is called the central tendency, which is nothing other than the long-run average default rate of the portfolio; then we use a calibration function that minimizes the distance, such that the average probability of default in your portfolio equals this central tendency. The last step is to construct the rating classes, and we can do it by a statistical method (such as the k-means method): you assign each observation of your sample to the rating class which minimizes the distance between the average PD of the rating class and the PD of the observation (the best class has the lowest PD). I can then map those PDs on a scale to rate them. (A sketch of this step follows below.)
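A minimal sketch of the calibration and of the rating-class construction, assuming an array of model PDs and a given central tendency; the simple intercept shift on the logit scale used here is one possible calibration function among many:

import numpy as np
from scipy.optimize import brentq
from sklearn.cluster import KMeans

def calibrate(pds, central_tendency):
    # Shift the scores on the logit scale so the mean PD matches the central tendency.
    scores = np.log(pds / (1 - pds))                       # back to score space
    gap = lambda c: np.mean(1 / (1 + np.exp(-(scores + c)))) - central_tendency
    shift = brentq(gap, -10, 10)                           # solve gap(shift) = 0
    return 1 / (1 + np.exp(-(scores + shift)))

def rating_classes(pds, n_classes=7):
    # Group calibrated PDs into rating classes via k-means on the PD values.
    km = KMeans(n_clusters=n_classes, n_init=10, random_state=0).fit(pds.reshape(-1, 1))
    order = np.argsort(km.cluster_centers_.ravel())
    return np.argsort(order)[km.labels_]                   # class 0 = lowest PD

# e.g. calibrated = calibrate(model_pds, central_tendency=0.025)
#      classes = rating_classes(calibrated)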

Answering the questions in the blue box:

a) 0
b) 1
c) If the ratio (bank debt over turnover) goes up, it means that the firm has fewer resources to repay its debt to the bank, so the probability of default is higher. The company has more short-term debt: if this ratio is very high, the status of the company is not good and it has a higher probability of default.
d) If this ratio (net income over total assets) is higher, the probability of default is lower, because if the return on assets is higher, the firm is doing better with respect to another firm that is using its assets in a less efficient way.

Now run the logistic regression and look at the solution of the excel file.
BANKING REGULATION, 21/05/2020

Loss Given Default (LGD) MODEL


At first glance, it can be summarized into three steps:
1. Identify the sample: it is very important to assess what kind of sample we are looking at. One is the development sample, which is the basis of inference and is made of the set of positions for which the full default history is observable. It is composed of all the historical observations on default positions. In particular, these are closed positions, meaning that their default history is complete (they have either been cured back to performing, or gone through litigation to the end), so we can observe the entire life of the position. The application sample is instead made of the set of positions for which the default history has not been observed yet and for which a forecast is necessary. Once you have studied the development sample, you can go to the application sample and apply the estimated parameters to the positions in that portfolio, in order to say that, given the characteristics of each position, the estimated LGD is this one. So, the difference must be very clear: the development sample is only the historical base on which the estimation is computed, but the numbers are then applied to the actual application sample. For example, if I have a mortgage with the bank and it is still open, this position is not part of the development sample.
2. Making inference: this means that we observe the loss rate realized in the development sample and we use that estimate to make inference on other positions characterized by the same risk characteristics. We need to identify risk drivers in order to explain the loss: these could be, for example, the geographical region, the class of exposure or the type of loan.
3. Applying the estimated parameters: what has been estimated is applied in this step to the positions belonging to the application portfolio to predict the future loss rate. The projection of the future losses is made by applying the estimated risk parameters on the basis of the risk characteristics of the positions belonging to the application portfolio of the bank. You split the portfolio by class of counterparty (public sector, large corporate, and so on), so that you can identify the type and estimate the risk parameters accordingly.
[Remark by Mosca] The rating dynamic is very important: there are different models with different features, but one thing supervisors ask banks to investigate is how fast the rating process reacts to changes in the scenario, like the pandemic. In principle you would like to have through-the-cycle measures, an indication of the probability of default that reflects the mix of good and bad years, a long-run average concept. This is the reason why we insist a lot on the calibration of the PD model. In practice it is never fully achieved, but it is still important to look at how fast the parameters react.

Given that we are already in default, we do not care about the performing status, but only about past due, unlikely to pay and litigation. We can differentiate among these three with criteria that get worse and worse as we go from past due (first stage) to litigation (third stage). We can define the components of the model, which are:
• Migration rate (this one and the next are linked to past due and unlikely to pay): this component is used to model the probability of migrating from one status to another. The migration can be in one direction, from the best to the worst position, or in the inverse direction.
• Exposure variation: it is linked to the same concept, but we are not talking about probabilities anymore; rather, about how the exposure changes when the position moves from one stage to another.
• Loss given litigation (this one and the next refer to the litigation status): it is aimed at taking into account the realized loss when the position goes through litigation.
• Adjustments: aimed at taking into account phenomena that cannot be perceived just by looking at the positions. For instance, if there is a downturn, I want to adjust my estimates to take this element into account as well.

Looking into detail.

MIGRATION RATE
We have said that this is the probability of migrating from one status to another. Starting from performing, we can arrive at default by passing through past due or unlikely to pay (which are intermediate steps) or directly into litigation. From the intermediate steps we can either get worse, down to litigation, or become performing again. There are also counterparties that stop meeting their obligations but then resume paying: once I am sure that they are able to repay their obligations, I can classify them as performing again.
We update the data every time we use the model from the beginning; otherwise we recalibrate the model, including the new data, and rerun the same model to see how the parameters change. This does not happen more often than once a year. We also have to update the data every time there is a change in the rules under European law.
We can go from performing to litigation directly if, for example, the debtor goes into bankruptcy, or dies and there is no one who can repay the loan.

• Sample identification: first entry, we have to identify in which position the clients are, so, according to the graph, in which branch we are. Cure rate: positions that are in a default status and go back to performing, starting to repay the debt. Danger rate: the probability that they move to a worse situation (litigation).
• Estimation: we can think of it as measuring the frequency with which each counterparty moves from one status to the other.
Excel Exercise
First default classification: Past Due, Unlikely to Pay or Litigation.
Last default classification: either Litigation or going back to Performing (-). Litigation is the only absorbing status, because from there you cannot change your situation; the complexity is that the other statuses are not absorbing.
In real life it often happens that positions go from PD to UtP to Litigation, while in the example there is only a direct shift from one status to another.
A position that goes from PD (or UtP) to Litigation contributes to the danger rate. If instead a position goes from PD (or UtP) to Performing, it contributes to the cure rate.
First question: we have to count how many counterparties are actually in past due status (56). The same for unlikely to pay.
Second question: the past due danger rate is the ratio between the counterparties that migrate from PD to Litigation and the total PD, i.e. how many counterparties migrated to the worst default classification:

    Danger rate (PD) = (# PD → Litigation) / (Total PD)

For the unlikely to pay it is the same:

    Danger rate (UtP) = (# UtP → Litigation) / (Total UtP)

Third question: the past due cure rate is the ratio between the counterparties that become performing and the total PD (or 1 − danger rate):

    Cure rate (PD) = (# PD → Performing) / (Total PD)

For the unlikely to pay:

    Cure rate (UtP) = (# UtP → Performing) / (Total UtP)
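A minimal sketch of these counts, assuming a pandas DataFrame with columns first_default_class and last_default_class as in the worksheet; the labels "PD", "UtP", "Litigation" and "Performing" are illustrative:

import pandas as pd

def danger_and_cure_rates(df, status):
    # Danger rate: share of `status` positions ending in litigation.
    # Cure rate: share of `status` positions going back to performing.
    sub = df[df["first_default_class"] == status]
    danger = (sub["last_default_class"] == "Litigation").mean()
    cure = (sub["last_default_class"] == "Performing").mean()
    return danger, cure

# e.g. danger_pd, cure_pd = danger_and_cure_rates(df, "PD")
#      danger_utp, cure_utp = danger_and_cure_rates(df, "UtP")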

EXPOSURE VARIATION
In this case we do not consider the performing status; we are not talking about probabilities anymore, but looking at exposures. Imagine that my mortgage exposure is 324 euros and I stop paying the instalments, so I am classified as default. If I then pay only 100, my exposure has changed: the exposure can change when the position moves from one status to another.
The important thing is that we only look at default positions, because the exposure variation from performing to past due (or UtP) is already taken into account in the exposure at default (EAD) parameter.

• Sample identification: cure rate, as before, towards performing; danger rate, towards litigation. The only difference is that we are not looking at probabilities.
Excel exercise
We look at the performing exposure when we want to estimate the cure rate, and at the exposure at the last default classification for the danger rate.
If the exposure at the first default classification is equal to the exposure at the last default classification, there is no exposure variation.
If the exposure at the last default classification is higher than at the first, it could be that the bank is giving money to a firm that is already in default, without being aware of it. For the residential mortgage, the client should repay, for example, 318, but he is in default and cannot pay: when he is classified as litigation, the amount is higher because the bank has started some processes to get the money back, so it charges more costs, or simply adds more interest.
If the exposure at the last default classification is lower than at the first, it could mean that the debtor has repaid some instalments, but not enough to be classified as performing, because there are some thresholds to consider. There is a probation period in which the obligor is paying back the instalments, but the bank wants to be sure that he will also pay the future ones: if he keeps paying for the entire probation period, he is classified again as performing.
How do we calculate the exposure variation? If first = last, it is 100%. It is the ratio

    Exposure variation = (exposure at last default) / (exposure at first default)

which has to be calculated for each counterparty. Once you have done this, you can calculate the past due delta exposure: it is the average of the ratios calculated above over the default positions whose initial status is past due.
The UtP delta exposure is instead the average of all the exposure variations for the unlikely-to-pay positions.
This is the same calculation for the cure rate.
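A minimal sketch, assuming columns exposure_first_default and exposure_last_default alongside first_default_class (illustrative names):

import pandas as pd

def delta_exposure(df, status):
    # Average ratio last/first exposure for positions entering default in `status`.
    sub = df[df["first_default_class"] == status].copy()
    sub["exposure_variation"] = sub["exposure_last_default"] / sub["exposure_first_default"]
    return sub["exposure_variation"].mean()   # 1.0 means no variation on average

# e.g. pd_delta = delta_exposure(df, "PD"); utp_delta = delta_exposure(df, "UtP")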

LOSS GIVEN LITIGATION


This component allows us to estimate the loss rate observed on each position once it is classified as litigation. These are all closed positions, meaning that the litigation process has been concluded. In this status the bank can run a recovery process, for example selling the house, something that the bank cannot do in the other statuses: it has already done whatever it could to recover the position, but the debtor is still not paying. The sample we consider here is not the application sample but the development one.
The formula (reconstructed below, after the list) has the following components:

• EAD, exposure at default


• The recoveries realized over the entire litigation period, discounted at the date of entry into litigation. This is everything the bank recovers from the loan. We subtract the recoveries because they reduce the loss: a positive effect.
• The cost components realized at each point in time during the litigation, discounted at the date of entry into litigation. These are costs such as the lawyer who must be paid during the process of selling the debtor's goods. We add the costs because they increase the loss: a negative effect.
Everything is divided by the EAD.
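Putting the components together, the workout formula sketched in the lecture reads as follows (R_t are the recoveries and C_t the costs at time t, r is the discount rate, and time is measured from the date of entry into litigation; the notation is mine):

LGL = \frac{EAD - \sum_{t}\dfrac{R_t}{(1+r)^{t}} + \sum_{t}\dfrac{C_t}{(1+r)^{t}}}{EAD}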
Suppose we compute this rate for all the positions in the sample; then I look at the characteristics associated with each counterparty. Here we are not going to split into different modules; we just look at different aspects of each position. For example, you look at the geographical location in order to estimate the recovery, or you look at the exposure of the loan, because if the amount is low the bank probably does not take any legal action. So these risk drivers refer both to the counterparty and to the loan, and they are useful to estimate and discriminate the loss given litigation component.
Each characteristic is associated with a dummy variable. I run a simple OLS regression where each regressor is a dummy variable representing a characteristic of the counterparty or of the loan: Y = LGD, X = dummies = risk drivers. (A sketch follows below.)
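A minimal sketch of that dummy regression, assuming a development-sample DataFrame with a realized lgl column and categorical drivers such as region and loan_type (illustrative names, hypothetical file):

import pandas as pd
import statsmodels.api as sm

dev = pd.read_excel("lgd_development_sample.xlsx")   # hypothetical file name

# One dummy per category level; drop_first avoids the dummy trap.
X = pd.get_dummies(dev[["region", "loan_type"]], drop_first=True, dtype=float)
X = sm.add_constant(X)

lgl_model = sm.OLS(dev["lgl"], X).fit()
print(lgl_model.summary())   # each beta is the LGL shift for that characteristic

# The fitted betas would then be applied to the open positions
# of the application portfolio, reusing the same dummy encoding.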

ADJUSTMENTS
These are elements that are not clearly observable in our data and that we use to adjust the previous results.
- Downturn component: for example, at a certain point in time, like 2008 after the crisis, when working on the model you realize that some loss rates are much higher than the others realized over the entire time series. The idea is to isolate this effect and see its impact on the loss given litigation: you look at what was realized in these worst observations and use it to add something more to the estimate.
- Open positions: we have talked about closed positions until now, but there are other positions that are in default and not yet closed. Think of a debtor with a mortgage who stops paying, is classified as past due, then as unlikely to pay, then in litigation, while the bank is still waiting to sell his house. What do we do with these positions? We still want to know what they are telling us. We take into account the most recent positions to see their loss rates. Suppose, for instance, that in the development sample we detect that borrowers from the north are riskier than those from the south, but the current portfolio of the bank is composed only of southern borrowers: then the actual portfolio is less risky than the closed positions suggest. So, at the bottom, we want to check the characteristics of the positions that are not included in the development sample but still exist in the actual application portfolio of the bank.
[Answer to Luca's question about applying these models for other purposes as well] All the models discussed so far are built for capital estimation purposes, but with some transformations they are also used for managerial purposes, like pricing mortgages and setting interest rates. They can also be used for provisioning, i.e. to quantify the expected loss component of the bank, or to identify recovery targets: for instance, sometimes the bank assigns a recovery target to the agent who is going to carry out the recovery activity, on the basis of the actual recovery estimated through the model.

- Disposal: once the bank has accumulated a lot of non-performing loans, the supervisor may come and say that some of them need to be sold to specialised recovery companies. In this case the bank realizes a huge loss rate, because it has to sell at a low price, so a low recovery is associated with those positions. Disposals are not included in the base estimation because they are anomalous positions, associated with anomalous loss rates, and for this reason they are taken into account afterwards, so as not to lose this information.
