Giulio Laudani                                                                                           Cod. 20263




            Scheme on CDS and Copula
Empirical data features
    Univariate world
    Multivariate world

Multivariate insight
    Measure and test
        Dependence Measure
        Test on distribution
    Gaussian multivariate correlation
    Other multivariate specification

Copula function
    What is a copula and when it is suitable
    Sklar's theorem [1959]
    How to generate a Copula
    Type of copula function
        Implicit types
        Explicit types
        Meta-distribution

Risk modeling
    Goals and features
    Approaches
    Comments and further details

Pricing Multi-asset derivatives
    The BL Formula
    The Practitioners' Approach
    The copula approach
    Credit Derivatives focus
        Market practice
        Pricing a CDS
        CDS typologies and CDO financial product


Empirical data features:
As an explanatory model of return evolution we assume that returns contain two components: a permanent information component and a temporary noise component. The noise prevails in short-term, high-frequency observations, while the permanent component emerges over longer horizons. This property implies:
- In general, predictability of returns increases with the horizon. The best estimate for the mean of returns at high frequency is zero, but a slowly evolving, time-varying mean of returns at long horizons could and should be modelled. [There is strong evidence of correlation between industrialized countries' stock markets.]
- If we run a regression, we expect high statistical significance of the parameters at long horizons.
- The presence of the noise component in returns causes volatility to be time-varying and persistent, and the annualized volatility of returns decreases with the horizon.
- Data show non-normal behaviour: the unconditional distribution has fatter tails.
- Non-linearity is also a feature of returns at high frequency; a natural approach to capture it is to distinguish alternative regimes of the world that govern alternative descriptions of the dynamics (such as the level of volatility in the market), for example via a Markov chain.

Univariate world:
These time series show little serial correlation but high correlation in absolute returns; expected conditional returns are close to zero; volatility appears to change over time; leptokurtosis and skewness are main features; extreme events appear in clusters (so they are to some extent predictable); finally, over longer horizons returns converge more closely to the Gaussian hypothesis.

Multivariate world:
Little correlation across time, except for contemporaneous returns; strong correlation in absolute returns; correlation varies over time, with a clustering effect [to compute it we should fit different models for changing correlation and make a statistical comparison]; extreme returns in one series are correlated with extreme returns in the other series (extreme dependence).


Multivariate insight:
This topic is relevant for pricing, asset allocation and risk management. When we deal with multiple securities we need to model not only the dynamics of each security, but also their joint evolution, i.e. their dependence structure.

Before describing the first approach we introduce some notation used hereafter:
- Consider a d-dimensional vector X; the joint distribution is F(x) = P(X_1 ≤ x_1, …, X_d ≤ x_d).
- The marginal distributions are F_i(x_i) = P(X_i ≤ x_i).
- If the F_i [marginal distributions] are continuous we can obtain the densities by differentiation, so the joint density is the non-negative function f(x) = ∂^d F(x) / ∂x_1 … ∂x_d.
- Note that the existence of a joint density implies the existence of all the marginal densities but not vice versa, unless the components are independent of one another (in which case the joint density is the product of the marginals).

Measure and test
Dependence Measure:
The linear correlation is the natural dependence measure only for the multivariate normal (where it coincides with the maximum likelihood estimator, hence it satisfies all the desirable properties) and when the variance is (must be) a finite number (not obvious, e.g. in the insurance business). It depends on the marginal distributions (since it is computed from moments; by the Hoeffding formula Cov(X, Y) = ∫∫ [F(x, y) − F_X(x) F_Y(y)] dx dy), hence it is not invariant under more general transformations (it is still invariant under linear transformations) and it would not capture the dependence dynamics under more general functions of the data.

The rank correlations are dependence measures that depend only on the copula and not on the marginals, so they are invariant under general increasing transformations. To compute them we only need the ordering¹ of each variable, not the actual numerical values (we assume continuous margins for simplicity). There are two such measures:
- Kendall's tau: in the sample version we count the number c of concordant and d of discordant pairs among the n observations of the two variables, and set τ = (c − d) / (n choose 2). The population version simply considers the probability of concordance minus the probability of discordance, τ = P((X_1 − X̃_1)(X_2 − X̃_2) > 0) − P((X_1 − X̃_1)(X_2 − X̃_2) < 0), where (X_1, X_2) and (X̃_1, X̃_2) are independent vectors with the same joint distribution.
- Spearman's rho: in the sample version we compute the rank variables (each series is ordered) and then the linear correlation between these ranks. The population version is the correlation of the probability-transformed marginals, ρ_S = corr(F_1(X_1), F_2(X_2)).
In the following we sketch the proof that these estimators depend only on the copula distribution function:
- Kendall's tau: τ = P((X_1 − X̃_1)(X_2 − X̃_2) > 0) − P((X_1 − X̃_1)(X_2 − X̃_2) < 0); since the two probabilities sum to one, we can write one as a function of the other, τ = 2 P((X_1 − X̃_1)(X_2 − X̃_2) > 0) − 1. Because the two vectors are i.i.d., P((X_1 − X̃_1)(X_2 − X̃_2) > 0) = 2 P(X_1 < X̃_1, X_2 < X̃_2) = 2 E[F(X̃_1, X̃_2)]; this last term involves the joint distribution, which can be expressed in integral form and, thanks to Sklar's theorem, rewritten as a function of the copula (by changing the integration domain to [0,1]²): τ = 4 ∫∫ C(u_1, u_2) dC(u_1, u_2) − 1, hence the estimator depends only on C.
- Spearman's rho: the population estimator is ρ_S = corr(F_1(X_1), F_2(X_2)) = Cov(F_1(X_1), F_2(X_2)) / √(Var(F_1(X_1)) Var(F_2(X_2))). By the Hoeffding formula the covariance is Cov = ∫∫ [F(x_1, x_2) − F_1(x_1) F_2(x_2)] dx_1 dx_2, hence it can be rewritten as a function of the copula; since F_i(X_i) is uniform on [0,1], its variance is 1/12, so putting these findings together we end up with ρ_S = 12 ∫∫ [C(u_1, u_2) − u_1 u_2] du_1 du_2.

There exists also a multivariate version of the rank correlation measures: instead of a single population value, the pairwise Kendall or Spearman measures are collected in a matrix, hence the estimator is at least positive semi-definite. Both measures take values in the interval [−1, 1] and equal 0 under independence.
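
As a quick illustration of the two measures (a minimal sketch using scipy, with simulated data standing in for real return series; the series x and y below are purely illustrative):

```python
import numpy as np
from scipy import stats

# Simulated stand-in for two return series. Both measures depend only on ranks,
# so any monotone transform of the data leaves them unchanged.
rng = np.random.default_rng(0)
x = rng.standard_normal(500)
y = 0.6 * x + 0.8 * rng.standard_normal(500)

tau, _ = stats.kendalltau(x, y)     # concordant vs. discordant pairs
rho_s, _ = stats.spearmanr(x, y)    # linear correlation of the ranks
print(f"Kendall tau  = {tau:.3f}")
print(f"Spearman rho = {rho_s:.3f}")

# Invariance check: a strictly increasing transform does not change the value.
tau2, _ = stats.kendalltau(np.exp(x), y)
print(f"tau after monotone transform = {tau2:.3f}")
```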

The tail dependence measures were introduced to judge the extreme dependence between pairs of variables (they are hard to generalize to the d-dimensional case); the idea is to take limits of conditional probabilities of quantile exceedances. The upper measure is λ_U = lim_{q→1⁻} P(X_2 > F_2⁻¹(q) | X_1 > F_1⁻¹(q)) and its values lie in the interval [0, 1]; the opposite holds for the lower tail, λ_L = lim_{q→0⁺} P(X_2 ≤ F_2⁻¹(q) | X_1 ≤ F_1⁻¹(q)). These formulas can be written as functions of the copula (the measures depend only on the copula, hence they are proper dependence measures) by applying the definition of conditional probability: λ_L = lim_{q→0⁺} C(q, q)/q and λ_U = lim_{q→1⁻} (1 − 2q + C(q, q))/(1 − q).



¹ When we refer to the ordering of a variable we need the concepts of concordance and discordance: a pair of observations is concordant when it is meaningful to say that one pair is bigger than the other (without using the probability distribution).
Some examples: the Gaussian copula is asymptotically independent in both tails; the t-copula has a symmetric tail dependence (thus not ideal for real data features), λ_U = λ_L = 2 t_{ν+1}(−√((ν + 1)(1 − ρ)/(1 + ρ))), where a lower number of degrees of freedom ν brings a higher tail dependence. The Gumbel copula has upper tail dependence λ_U = 2 − 2^{1/θ}, and the Clayton copula has lower tail dependence λ_L = 2^{−1/θ}.

Test on distribution:
To test a univariate distribution hypothesis we can use the QQ-plot technique to get a graphical sense of its validity. For a numerical answer we can use the Jarque-Bera test, which is based on a joint check of the skewness and kurtosis of the sample.

To test a multivariate distribution we need other tools, specifically designed for the purpose. The first is based on a quadratic form: for each observation we compute the squared Mahalanobis distance D_i² = (X_i − μ)' Σ⁻¹ (X_i − μ), which should be distributed as a chi-square with d degrees of freedom; this can be checked graphically with a QQ plot or numerically with the Mardia test (based on D and on the multivariate third and fourth moments: the skewness statistic is distributed as a chi-square with d(d+1)(d+2)/6 degrees of freedom, the kurtosis statistic follows a standard Gaussian), which is basically a restatement of the Jarque-Bera idea. Note, however, that it is not a joint test, since skewness and kurtosis are tested separately.
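
A minimal sketch of the quadratic-form check described above, with simulated data in place of real returns: compute the squared Mahalanobis distance of each observation and compare its empirical quantiles with those of a chi-square with d degrees of freedom (the numerical counterpart of the QQ plot).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
X = rng.multivariate_normal(mean=[0, 0, 0],
                            cov=[[1, .3, .2], [.3, 1, .1], [.2, .1, 1]],
                            size=1000)                 # stand-in data, d = 3

mu = X.mean(axis=0)
Sinv = np.linalg.inv(np.cov(X, rowvar=False))
# Squared Mahalanobis distance of each observation.
d2 = np.einsum('ij,jk,ik->i', X - mu, Sinv, X - mu)

# Under multivariate normality d2 ~ chi-square with d degrees of freedom:
# compare empirical quantiles with the theoretical ones (QQ-plot data).
q = np.linspace(0.01, 0.99, 25)
emp = np.quantile(d2, q)
theo = stats.chi2.ppf(q, df=X.shape[1])
print(np.c_[theo, emp])   # the two columns should lie close to each other
```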

Gaussian multivariate correlation
The first and most trivial measure of dependence is the correlation, or equivalently the variance-covariance matrix. It is at least a positive semi-definite matrix, and if it is positive definite it is possible to apply the Cholesky decomposition. It is usually computed with the sample estimator².

The Gaussian distribution is the simplest hypothesis: it is characterized by the first two moments, which entirely describe the dependence structure; linear combinations remain normally distributed; and it is easy to aggregate different distributions (we just need to know the correlation between them).

Several closed formulas exist under the geometric Brownian motion for derivatives pricing, and it is simple to manage risk using the VaR approach. However, the Gaussian world is at odds with the empirical evidence and has a poor fit to real data³, hence we need to develop a new set of models that allow for fat tails and more flexibility in defining the dependence structure. Note that for a multivariate set of variables to be jointly Gaussian, each of the univariate distributions must be Gaussian itself.

Other multivariate specification
The first attempt to solve the problem was to work with the conditional distribution, such as a GARCH model (e.g. GARCH(1,1): σ²_t = ω + α ε²_{t−1} + β σ²_{t−1}), where we hope that the returns rescaled by volatility are normal. Even if these models perform better than the classical one, we still need something more complete; furthermore the specification is really demanding for multivariate needs⁴. The second attempt was to use a different distribution with fatter tails, such as the t-Student distribution⁵

² We are implicitly assuming that each observation is independent and identically distributed; only in this case does this estimator coincide with the maximum likelihood estimator and have all the desired properties. Furthermore this estimator depends on the true multivariate distribution.
³ We observe fat tails both in the marginal distributions and in the joint distribution; furthermore volatility seems to move in clusters, while returns are less time dependent, and correlation varies over time.
⁴ The number of elements to be modelled grows rapidly with the number of securities, where d is the number of coefficients in the algorithm specification. The same problem arises for EVT.
(note that each marginal distribution must then be t-Student with the same characteristics) or more advanced parametric specifications: Normal Inverse Gaussian or Lévy processes. The third possibility is to model just the extreme tail behaviour (EVT), which is very popular and useful in the risk-metrics world, less so when we need to price (in those cases we need the whole distribution).

A totally different approach to the problem is given by dimension reduction techniques. The general idea is based on the empirical observation that a limited number of common factors explains most of the dynamics. The simplest way to achieve this is PCA (here the eigendecomposition of the covariance matrix plays the central role).
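
A minimal PCA sketch on simulated returns (the one-factor panel below is illustrative): the eigendecomposition of the sample covariance gives the common factors and the share of variance they explain.

```python
import numpy as np

rng = np.random.default_rng(2)
# Stand-in for a panel of returns driven by one common factor plus noise.
factor = rng.standard_normal((1000, 1))
loadings = rng.uniform(0.5, 1.0, size=(1, 10))
R = factor @ loadings + 0.3 * rng.standard_normal((1000, 10))

cov = np.cov(R, rowvar=False)
eigval, eigvec = np.linalg.eigh(cov)             # eigenvalues in ascending order
eigval, eigvec = eigval[::-1], eigvec[:, ::-1]   # sort descending

explained = eigval / eigval.sum()
print("share of variance explained by the first component:", round(explained[0], 3))

# Keeping the first k components gives the reduced-dimension representation.
k = 1
scores = (R - R.mean(axis=0)) @ eigvec[:, :k]
```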

To conclude, all of these proposals model a joint distribution without any insight into the marginal distributions (or at least do not allow them to differ from the ones implied by the chosen joint distribution), hence they are not flexible approaches. It would be far more interesting to put together different marginal distributions that properly fit the empirical data and then specify the joint distribution on top of them, in a bottom-up fashion; this is the copula method, which is treated in the following section.


Copula function
This approach is preferable for its higher flexibility: the bottom-up approach allows us to independently control/specify an appropriate distribution for each factor and the joint dependence structure. Other important properties: it is independent of scaling effects since it works on quantiles, and it is invariant under strictly increasing transformations.

What is a copula and when it is suitable
A d-dimensional copula is a distribution function on the unit hypercube [0,1]^d with standard uniform marginal distributions. We denote copulas as C(u) = C(u_1, …, u_d), and a function must satisfy the following properties (otherwise it does not specify a copula):
    1. C(u_1, …, u_d) is increasing in each component (to prove it, take the first derivatives).
    2. If at least one component u_i = 0, the copula function must equal 0.
    3. C(1, …, 1, u_i, 1, …, 1) = u_i, meaning that if every other marginal is realized, the copula reduces to the i-th marginal distribution. To prove it, set the other variables to 1 and let u_i vary.
    4. The rectangle inequality Σ_{i_1=1}^{2} … Σ_{i_d=1}^{2} (−1)^{i_1+…+i_d} C(u_{1,i_1}, …, u_{d,i_d}) ≥ 0 for all (a_1, …, a_d), (b_1, …, b_d) with a_j ≤ b_j, where u_{j,1} = a_j and u_{j,2} = b_j.
        o In the bivariate case it simply means that for every a_1 ≤ b_1 and a_2 ≤ b_2 we have: C(b_1, b_2) − C(a_1, b_2) − C(b_1, a_2) + C(a_1, a_2) ≥ 0.

Properties (1) and (3) are required to have a multivariate distribution function and to ensure that the marginal distributions are uniform. Property (4) states that the copula function is d-increasing. If a function C fulfills these properties, it is a copula⁶.




⁵ Since it is an elliptical distribution, it enjoys most of the Gaussian properties. The most important one, shared by the whole elliptical family, is that the normalized value has a distribution depending only on ν (degrees of freedom) and d (number of securities). Furthermore it is closed under convolution (if two variables have the same dispersion matrix we can add them to obtain a new distribution of the same family, assuming independence). Note that to fit fat-tailed data we may end up with few degrees of freedom.
⁶ Also, for 2 ≤ k < d, the k-dimensional margins are themselves copulas.
Sklar’s theorem [1959]
This famous and important theorem states that a copula can be extracted from any joint distribution with known margins, and that the copula is unique if the marginal distributions are continuous; otherwise, uniqueness holds only on the product of the ranges of the marginal distributions. The proof relies on the generalized inverse (quantile transform).

The first part of the proof: assuming the existence of a joint distribution function F(x_1, …, x_d) with margins F_1, …, F_d, there exists a copula such that F(x_1, …, x_d) = C(F_1(x_1), …, F_d(x_d)); setting u_i = F_i(x_i) (and assuming continuous margins, so that x_i = F_i⁻¹(u_i)), by substituting we have C(u_1, …, u_d) = F(F_1⁻¹(u_1), …, F_d⁻¹(u_d)).

The second part: given a (unique) copula with given margins, we want to prove the existence of a joint distribution with the previously defined margins.
    1. Take a vector U with the same distribution as the copula C.
    2. Compute the variables X_i = F_i⁻¹(U_i), where the F_i⁻¹ are the inverse marginal distributions (used to build the copula).
    3. Then write the distribution function of the vector X: P(X_1 ≤ x_1, …, X_d ≤ x_d) = P(U_1 ≤ F_1(x_1), …, U_d ≤ F_d(x_d)) = C(F_1(x_1), …, F_d(x_d)); this last term shows the equivalence of the joint distribution of X to the copula evaluated at the margins, when the margins are assumed continuous.

Another important result of the theorem is that the copula can be extracted from a multivariate joint distribution as C(u_1, …, u_d) = F(F_1⁻¹(u_1), …, F_d⁻¹(u_d)). Every copula is bounded between the Fréchet limits: countermonotonicity, W(u_1, …, u_d) = max(u_1 + … + u_d − d + 1, 0) [itself a copula only for d = 2], and comonotonicity, M(u_1, …, u_d) = min(u_1, …, u_d), which is a copula in every dimension.
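
A small numerical check of the Fréchet bounds in the bivariate case, using the independence copula Π(u, v) = uv as the test case (any copula must lie between W and M):

```python
import numpy as np

def W(u, v):   # countermonotonicity (lower Fréchet bound, a copula only for d = 2)
    return np.maximum(u + v - 1.0, 0.0)

def M(u, v):   # comonotonicity (upper Fréchet bound)
    return np.minimum(u, v)

def Pi(u, v):  # independence copula, used here as the test case
    return u * v

u, v = np.meshgrid(np.linspace(0, 1, 21), np.linspace(0, 1, 21))
assert np.all(W(u, v) <= Pi(u, v) + 1e-12)
assert np.all(Pi(u, v) <= M(u, v) + 1e-12)
print("W <= Pi <= M holds on the whole grid")
```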

How to generate a Copula:
Structural models propose as correlation the asset correlation, basically proxied by the equity correlation. Intensity-based approaches propose to use, for a given time horizon, the correlation of the default indicators; unfortunately not enough data are available on joint default observations. Hence a Monte Carlo algorithm has been proposed, Li's model (2000), which uses the hazard rates bootstrapped from the CDS term structure and a given correlation taken from asset correlations; it is however computationally intense, since default events are rare.

If we start from a given copula distribution (the first example is the Gaussian one), we need to decompose the correlation matrix via Cholesky, Σ = A A' (we also need a procedure to produce this matrix). We then draw d independent random normal variables Y and multiply them by A to obtain the variables Z (A contributes the correlation structure); mapping Z through the standard normal CDF gives the uniforms of the chosen Gaussian copula. To generate a t-copula we need one extra step: multiply Z by √(ν/s), where ν is the number of degrees of freedom and s is a random variable drawn from a chi-square distribution with ν degrees of freedom, and then map through the t CDF.
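
A minimal sketch of the simulation recipe just described, with an illustrative 2x2 correlation matrix and ν = 4 degrees of freedom:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
P = np.array([[1.0, 0.5], [0.5, 1.0]])   # illustrative correlation matrix
A = np.linalg.cholesky(P)                 # requires P positive definite
n = 10_000

# Gaussian copula sample: correlate iid normals, then map through the N(0,1) CDF.
Y = rng.standard_normal((n, 2))
Z = Y @ A.T
U_gauss = stats.norm.cdf(Z)

# t-copula sample: rescale the same correlated normals by sqrt(nu / s),
# s ~ chi-square(nu), then map through the t CDF with nu degrees of freedom.
nu = 4
s = rng.chisquare(nu, size=(n, 1))
T = Z * np.sqrt(nu / s)
U_t = stats.t.cdf(T, df=nu)

# Both samples have uniform margins; only the dependence structure differs.
print(stats.kendalltau(U_gauss[:, 0], U_gauss[:, 1])[0],
      stats.kendalltau(U_t[:, 0], U_t[:, 1])[0])
```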

How to fit a copula on empirical data:
A different problem is to fit a proper copula dependence structure to empirical data; several methodologies are available:
- Full maximum likelihood, where we estimate the copula and margin parameters all together; this method is of course the most accurate and unbiased, but it is tremendously computationally intense.
- IFM, or inference functions for margins, where we specify a parametric marginal distribution, use it to compute the cumulative probabilities of the empirical data set, and then use those probabilities to run a maximum likelihood estimation of the copula parameters. This method highly depends on the margin specification, hence it is quite unstable.
- Canonical maximum likelihood is similar to the previous one, however we use the empirical marginal distributions; we stay neutral on the margins while focusing more on the joint dynamics.
- Method of moments (used by MatLab), where we use the empirical estimate of some rank correlation measure, which is assumed to be our lead indicator/driver (see the sketch below).
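
For elliptical copulas the method-of-moments route has a closed form, ρ = sin(πτ/2), linking the sample Kendall tau to the copula correlation parameter; a minimal sketch (assuming a Gaussian or t copula, with simulated data standing in for empirical returns):

```python
import numpy as np
from scipy import stats

def fit_elliptical_rho(x, y):
    """Method-of-moments estimate of the copula correlation parameter
    from the sample Kendall tau (valid for Gaussian and t copulas)."""
    tau, _ = stats.kendalltau(x, y)
    return np.sin(np.pi * tau / 2.0)

# Illustrative data standing in for two empirical return series.
rng = np.random.default_rng(4)
x = rng.standard_normal(2000)
y = 0.7 * x + np.sqrt(1 - 0.7**2) * rng.standard_normal(2000)
print(round(fit_elliptical_rho(x, y), 3))   # should sit near the true 0.7
```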

Type of copula function
All copula functions can also be expressed in terms of survival probabilities; in that form they are named survival copulas. An index of the flexibility of a copula's dependence structure is the number of parameters in its specification; the most flexible one presented here is the t-copula (degrees of freedom and the correlation matrix).

Implicit types:
To this class belong all the copulas that do not have an explicit formula. The Gaussian copula, C_P(u) = Φ_P(Φ⁻¹(u_1), …, Φ⁻¹(u_d)), depends only on the correlation matrix P; the independence copula (correlation 0) and the comonotonicity copula (correlation 1) are special limiting cases of the Gaussian one, while the countermonotonic case (correlation −1) is defined only in the bivariate setting.

The t-Student copula is another example. Note that in this class we do not require the margins to follow any specified distribution.

Explicit types:
Closed formulas are available for this class of copulas. The Fréchet copula is a combination of the three fundamental copulas: the independence copula and the two Fréchet limits (each of them is an explicit copula). It is a sort of weighted average with coefficients α and β: C(u_1, u_2) = β M(u_1, u_2) + (1 − α − β) Π(u_1, u_2) + α W(u_1, u_2), where α and β describe the dependence structure.



To the Archimedean class belong:
- The Gumbel copula, defined by the equation C(u_1, u_2) = exp(−[(−ln u_1)^θ + (−ln u_2)^θ]^{1/θ}), where θ is the only parameter (not really flexible), bounded between 1 and infinity. At the extremes it converges to the independence copula (θ = 1) and to the comonotonicity copula (θ → ∞).
- The Clayton copula, defined by the equation C(u_1, u_2) = (u_1^{−θ} + u_2^{−θ} − 1)^{−1/θ}, where θ is the only parameter (not really flexible), bounded between 0 and infinity. At the extremes it converges to the independence copula (θ → 0) and to the comonotonicity copula (θ → ∞).
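
For reference, the two Archimedean copulas and their tail-dependence coefficients (λ_U = 2 − 2^{1/θ} for Gumbel, λ_L = 2^{−1/θ} for Clayton) can be coded directly; the parameter value below is purely illustrative.

```python
import numpy as np

def gumbel_copula(u, v, theta):       # theta >= 1
    return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

def clayton_copula(u, v, theta):      # theta > 0
    return (u ** (-theta) + v ** (-theta) - 1.0) ** (-1.0 / theta)

def gumbel_upper_tail(theta):
    return 2.0 - 2.0 ** (1.0 / theta)

def clayton_lower_tail(theta):
    return 2.0 ** (-1.0 / theta)

theta = 2.0                            # illustrative parameter
print(gumbel_copula(0.9, 0.95, theta), gumbel_upper_tail(theta))
print(clayton_copula(0.05, 0.1, theta), clayton_lower_tail(theta))
```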

Meta-distribution:
In this case we do not directly apply Sklar's theorem. At first we generate the random variables U (cumulative probabilities) from a specified copula function (the one we want to use); we then feed these generated variables into specified inverse marginal distributions to generate our random variables. Basically we apply the inverse procedure, from the top (copula) to the bottom (margins).

An example is Li's model, where a Gaussian copula is used to join exponential margins to obtain a model for default times when these default times are considered to be correlated.
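
A minimal sketch of this recipe in the spirit of Li's model: draw uniforms from a Gaussian copula and push them through the inverse exponential margin F⁻¹(u) = −ln(1 − u)/λ to obtain correlated default times (the hazard rates and correlation below are illustrative, not calibrated):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
lam = np.array([0.02, 0.03])            # illustrative flat hazard rates per name
P = np.array([[1.0, 0.4], [0.4, 1.0]])  # illustrative asset correlation
A = np.linalg.cholesky(P)
n = 100_000

# Step 1: uniforms with a Gaussian dependence structure (the copula draw).
U = stats.norm.cdf(rng.standard_normal((n, 2)) @ A.T)

# Step 2: inverse exponential margins turn the uniforms into default times.
tau = -np.log(1.0 - U) / lam

# Joint default probability over a 5-year horizon (both names defaulting).
print(np.mean((tau[:, 0] < 5) & (tau[:, 1] < 5)))
```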





Risk modeling

Goals and features:
As risk managers we are interested in the loss distribution⁷, defined (in general) as L_{t+Δ} = −(V_{t+Δ} − V_t), where the randomness comes in through the time variable: the loss is a function of the horizon. Generally speaking we are interested in a risk measure ρ(L) of this distribution, which defines the capital absorbed (for regulators).

These numbers may be used as: a management tool, for capital requirements, and for risk-adjusted performance measurement. To perform those tasks we need a model to predict the asset value evolution; the most used one is a factor model mapped on specific securities' features. However, this approach has the limits of assuming a constant composition of the portfolio and of making the prediction via a linear approximation, which works only for small factor changes and a negligible second-derivative effect.

Risk management may use unconditional or conditional distributions, where the latter are preferable to assess a dynamic environment in which new information arrives through time. Unconditional distributions are used for time intervals greater than roughly four months, since the time dependence of returns starts to be negligible. It is a crucial task to understand the data, the needs and the tools available in the daily RM routine.

Approaches:
The notional approach is the most used by regulators, since it is very simple. The general idea is to weight the entity's assets by specific risk levels. This method has severe limits: there are no diversification or netting benefits, and for derivatives positions it is wrong to account for the notional value, since the transaction is anchored to a specific payoff.

Factor models follow a delta-gamma approach. They are not comparable across different asset classes and do not allow to consider the overall risk (you would need to model the correlation of all the factors; it makes no sense to add delta and vega effects together); however they perform well when judging well-specified events.

The most used loss-distribution method is the VaR approach. For a given portfolio, confidence level and time horizon, VaR is defined as the threshold value such that the probability that the mark-to-market loss on the portfolio over the given time horizon exceeds this value (assuming normal markets and no trading in the portfolio) equals the given probability level. Its limits are: there is no clear and fully agreed calibration of the parameters (confidence level and time horizon, which is why regulators force you to use one day at 99%); it is not always sub-additive, a problem that becomes more severe when we deal with non-elliptical distributions; and it gives no insight into the possible losses beyond the threshold (the tail).

The ES overcomes those issues (it is a coherent risk measure), however it is tough to compute; one solution is to slice the tail distribution into equal-probability slices, compute the VaR of each and take the mean, but for non-parametric distributions the observations in the tail might be too few. Both measures can be obtained from parametric, historical or Monte Carlo distributions, though the latter are intensive on the computational side.
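
A minimal sketch of the loss-distribution measures: VaR as the loss quantile and ES as the average loss beyond it, computed here on a simulated fat-tailed stand-in for a historical P&L series.

```python
import numpy as np

def var_es(losses, alpha=0.99):
    """Historical-simulation VaR and expected shortfall at confidence level alpha.
    Losses are positive numbers; VaR is the alpha-quantile, ES the mean beyond it."""
    var = np.quantile(losses, alpha)
    es = losses[losses >= var].mean()
    return var, es

rng = np.random.default_rng(6)
# Stand-in loss sample with fat tails (Student-t), in place of real P&L data.
losses = 1e6 * rng.standard_t(df=4, size=10_000)
var99, es99 = var_es(losses, alpha=0.99)
print(f"VaR 99%: {var99:,.0f}   ES 99%: {es99:,.0f}")
```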
- Market risk
- Credit risk: within this definition we have the default risk itself and migration risk. Two modelling approaches:
    o RiskMetrics (CreditMetrics), where we generate a whole migration matrix to be used


⁷ Our interest in losses is related both to the job performed by risk management and to the relevance of well-defined academic modelling.
    o Credit view (CreditPortfolioView), whose idea is to use a macroeconomic perspective to correct the risk measurement; note that if the time horizon used is point-in-time there is no need to adjust for the economic cycle, since in that case there would be double counting

The scenario-based approach suffers from misspecification; it is good only for well-defined problems (factors) and it is not easy to compare across multi-asset securities.

Variance was historically used as a risk measure, however it highly depends on the distributional hypothesis (it is meaningful only for symmetric distributions, since deviations are equally weighted) and requires the second moment to exist. Possible solutions are partial (lower) moments, which are the base idea of the expected shortfall.

Comments and further details:
Backtesting is really important. The idea is to check how many times the measure fails over time; it is a pointless/daunting task to run backtesting on ES, since VaR exceptions are few and averaging dilutes the effect of errors.

The scaling problem is addressed with overlapping or with separated time intervals. The first allows us not to lose data points but introduces serial correlation in the data; the second reduces the amount of usable data. The convention of rescaling by multiplying by a square-root-of-time factor is correct only if the data are i.i.d., which is not the case.

Stress tests must always be undertaken by the risk manager; this technique is a perfect complementary tool to track the portfolio risk dynamics.

Any risk measure should satisfy the coherent-risk-measure properties: monotonicity, translation (cash) invariance, positive homogeneity and sub-additivity.

Some comments on the major characteristics of credit models:
- Default-mode versus multinomial: only CreditRisk+ belongs to the first.
- Future values vs. loss rates: the model can be based on the distribution of possible future values or of possible future losses; the first uses as input the spread curve by maturity, while in the second the spread does not need to be known. CreditMetrics is a typical market-value model, while CreditRisk+ is a loss-rate one.
- Conditional vs. unconditional: CreditPortfolioView belongs to the first; however this distinction is useful only if the model works through the cycle.
- Monte Carlo vs. analytical solution.
- Asset correlation vs. default correlation: this is less important than the others, in fact the two are close to each other. CreditMetrics belongs to the asset-correlation group, CreditRisk+ to the second one.




Pricing Multi-asset derivatives
The finance cornerstone in pricing any derivative is the risk-neutral valuation formula, price_0 = E^Q[e^{−rT} f(X_T)], where our goal is to compute a meaningful and arbitrage-free measure for the distribution entering the payoff function f(x).

The multi-asset derivatives family is the one where the payoff depends on more than one underlying. A CDS by itself is not a multi-asset derivative; however there exist in the market examples of derivatives offering protection against the default of more than one underlying. The most used multi-asset payoff types are: best-of, worst-of and basket options. There are three possible frameworks to price these derivatives; the following sections give a brief description of all of them, but our focus is on the copula one.

The BL Formula
In this set of hypotheses the motion used is the multivariate geometric Brownian one, where the correlation between assets is modelled in the stochastic (diffusion) component through the relationship dW_i dW_j = ρ_{ij} dt. In this framework we can obtain closed formulas; however the distribution is lognormal and the parameters are assumed constant, hence the risk-neutral density of log returns is normal, which is not really realistic.

The Practitioners’ Approach:
This approach usually introduces some more sophistication into the pricing formula, such as stochastic volatility and jump-diffusion processes for the individual random variables. At first we calibrate the model parameters for each asset, then we estimate the correlation matrix from historical data. The last stage is to use Monte Carlo simulation to compute the price.

This approach is really flexible and consistent with univariate pricing (Rosenberg argues that it is meaningful to mix an objective dependence structure with the univariate distributions); however the linear correlation estimator is a poor tool to describe the dependence structure.

The copula approach:
Given a copula function we can easily compute the joint density function as f(x_1, …, x_d) = c(F_1(x_1), …, F_d(x_d)) · f_1(x_1) · … · f_d(x_d), where the term c is simply the mixed derivative of the copula function, c(u_1, …, u_d) = ∂^d C(u_1, …, u_d) / ∂u_1 … ∂u_d. This approach is an extension of the previous one, since we are modelling a more sophisticated dependence structure via the copula.

With this information we can recall the pricing formula of a European derivative with general payoff g: price_0 = e^{−rT} ∫ g(x) f(x) dx, recalling that the joint density is a function of the marginal densities (easily calibrated from market data) and of the risk-neutral copula (not easy to extract from market data); however it has been shown that the historical copula equals the risk-neutral one under fairly general hypotheses, essentially that the change of measure acts on the margins through increasing (e.g. affine) transformations.

This approach is really flexible, suitable for both parametric and non-parametric specifications, and it can furthermore be extended to add more sophisticated dynamics. Note that we are still assuming a static approach, where the parameters are not allowed to change over time and the information is derived from historical behaviour.
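
A minimal Monte Carlo sketch of the copula pricing recipe for a worst-of European call: simulate uniforms from the chosen copula (Gaussian here), map them through risk-neutral lognormal margins, and discount the average payoff; all parameter values are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
S0 = np.array([100.0, 100.0]); sigma = np.array([0.2, 0.3])
r, T, K = 0.02, 1.0, 100.0
P = np.array([[1.0, 0.6], [0.6, 1.0]])     # illustrative copula correlation
A = np.linalg.cholesky(P)
n = 200_000

# Uniforms from a Gaussian copula...
U = stats.norm.cdf(rng.standard_normal((n, 2)) @ A.T)
# ...pushed through risk-neutral lognormal margins (Black-Scholes terminal law).
# For the Gaussian copula this ppf step just recovers the correlated normals,
# but the same structure works unchanged with any other copula sampler.
Z = stats.norm.ppf(U)
ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)

payoff = np.maximum(ST.min(axis=1) - K, 0.0)   # worst-of call payoff
price = np.exp(-r * T) * payoff.mean()
print(round(price, 3))
```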

Credit Derivatives focus:
To this class of derivative securities belongs any instrument that enables the trading/management of credit risk in isolation from the other risks associated with the underlying. These instruments are traded over the counter, hence they are very flexible; however they have been standardized in recent years to enhance liquidity.


They have been used as risk management tools and as trading tools to take synthetic exposure on creditworthiness; in 2012 the whole CDS market (single- and multi-name) had a market value equal to $1,666 billion, and this market accounts for 72% of the whole derivatives notional amount.

Regulators have added new accounting/capital requirements for these classes of assets: the CVA (counterparty risk adjustment). The elements of this "new" risk are: it is highly skewed against the protection buyer; the buyer's own riskiness should also be taken into account (bilateral CVA); the maximum loss is the MTM at default. Possible solutions to hedge this risk are: netting, diversifying the counterparties (not feasible for CDS derivatives), posting collateral (illiquid assets require time to be converted into cash), or a clearing house (the pricing of these instruments will account for the cash/margin requirements). It may be priced using the present value of a swaption for each node (default time) multiplied by the probability of default bootstrapped from CDS [with a structural or reduced-form approach]. This solution is justified by the economic intuition about the kind of risk undertaken by whoever holds the position: basically, a default of the counterparty causes a loss at a random time in the future of a given amount, and we can hedge it by buying a swaption, so that at a random date in the future we have the option to protect ourselves against this credit event. This kind of risk is defined as wrong-way risk, since it always moves against us.

Market practice:
A CDS is an insurance-like contract through which one party (the protection seller) provides protection against the credit risk of a single entity or of multiple entities in exchange for periodic premium payments. These premiums are paid on a regular basis (normally quarterly) until the credit event occurs or until the CDS expiry⁸.

The spread traded in the market is the annualized premium under the act/360 convention, expressed in basis points. The recent convention is to quote standardized coupons (100 bp for investment grade and 500 bp for junk bonds under the US convention; the European convention has more than two levels), corrected with an upfront payment which varies according to the default risk. The contract must clearly specify which credit events the protection seller is liable for from the starting date; furthermore it must be clarified whether there will be a cash settlement⁹ or a physical settlement¹⁰, which is the most common¹¹. Possible events are bankruptcy, failure to pay, or any restructuring event. A CDS usually starts 3 days after the contract date (spot contract); if longer, it is a forward-starting CDS. When the protection seller receives the default notification, the cheapest-to-deliver security must be delivered within 72 days.

Pricing a CDS:
The CDS market is so liquid that pricing is led by demand and supply; however there exist two main pricing frameworks, needed to price more exotic payoffs, to unwind¹² existing CDS positions, or in general when we need to assess the mark-to-market value of an existing CDS position. To perform those tasks we need to specify a model; we cannot simply take the difference between spreads, since the credit event is random.

Any model used to price a CDS needs to properly define the default probabilities of both the reference entity and the protection seller and their correlation, any cheapest-to-deliver option granted, the maturity of the CDS, and the expected recovery rate.




⁸ The most liquid maturity is 5 years, followed by 3, 7 and 10 years.
⁹ The seller pays the buyer the face value of the debt minus the recovery rate calculated on the basis of dealer quotes.
¹⁰ The buyer delivers any deliverable obligation of the reference entity to the seller in return for the notional amount in cash.
¹¹ There also exist digital default swap agreements, where the amount paid by the seller is pre-determined.
¹² To close an existing position there are three possible solutions: settle any difference with the originator, sell the position to another counterparty (so the buyer is replaced), or enter into an offsetting transaction with another counterparty.
The first model is the asset-swap approach, whose intuition lies in the possibility of replicating the CDS payoff with other traded securities, i.e. an interest rate swap plus buying/selling the underlying bond, financed/invested in the repo market. This approach may sound equivalent to the CDS; however the position in the IRS is not automatically cancelled in case of default¹³, which is why there is a basis between the two strategies, equal to zero in the long term (on average). Some hedge funds trade the two markets against each other to make money out of it, betting on a reversion of the basis in the long run.

The second is the default probability models family, which divides into structural models and intensity-based/reduced-form models¹⁴. The first try to estimate default as a consequence of the reference entity's characteristics; their limits are that the data (usually accounting based) are not really reliable/transparent, they are not flexible enough to fit the market structure (there is no fitting procedure: we should simply believe that our model delivers the same result as the market, with no boundary constraining the model), and they are not easily extendable to price CDS. The second family focuses on modelling the probability of the credit event itself; these models are extendable, flexible and the most used ones (Jarrow & Turnbull, 1995).

Both approaches consist in finding, from the breakeven CDS spread equation (defined later), the survival probabilities. The breakeven CDS spread is found by exploiting the relationship between the premium and the protection leg, which must have equal value at origination. Hereafter are the formulas used to value the two positions. We need to find the mark-to-market value of the protection and premium legs:
- The MTM of an existing position is equal to (S_current − S_contract) × RiskyPV01 [positive if we are long protection, with the opposite sign for the short position], where the last term is the expected present value of 1 bp of premium paid until default or maturity, whichever is sooner. To assess the value of the RiskyPV01 we need to understand how we can offset the position:
    o Ask the original counterparty to close the position, or be replaced by another market participant; in this case all the P&L is taken immediately.
    o Create an offsetting position that allows us to receive the spread difference; in this case the P&L is earned over the remaining life and the payoff is random, since if default comes before maturity the flows will stop. Note that there is no default risk on the notional, since the deliveries offset each other.
- The RiskyPV01 must be arbitrage-free, and we need to take into account the riskiness/occurrence of each premium payment.

The Model
Desirable properties of the model are: capture the credit risk of the reference entity; model the payment as a percentage of face value; model the timing of default; be flexible, arbitrage-free and simple, respecting the bid-ask constraint (model values should lie within that spread) and controlling for the computational burden.

The present value of the premium leg will be PMLPV + PVAP, where PMLPV = S_N Σ_{n=1}^{N} Δ_n Z(t_n) Q(0, t_n); here S_N is the contractual CDS spread, Δ_n is the day count fraction, Z(t_n) is the discount factor and Q(0, t_n) is the arbitrage-free survival probability from 0 to the n-th payment date. This formula ignores the effect of the premium accrued¹⁵ from the last premium payment to the credit event; to consider this effect the accrual term becomes



¹³ Furthermore other factors have to be considered: [fundamental] different payoffs, different liquidity, different market participants and possible frictions in the repo market; [technical] the cheapest-to-deliver option, the counterparty risk differential, transaction costs to package the asset swap.
¹⁴ The difference lies in how the probability measure in the formula is computed.
¹⁵ If the accrued premium is not considered, the value of the CDS is always higher; this effect depends on the RR and on the default frequency, and if we assume a flat structure it can be represented in closed form.
PVAP = S_N Σ_{n=1}^{N} ∫_{t_{n−1}}^{t_n} Δ(t_{n−1}, t) Z(t) (−dQ(0, t)), which is hard to compute, so we approximate it with PVAP ≈ S_N Σ_{n=1}^{N} (Δ_n / 2) Z(t_n) [Q(0, t_{n−1}) − Q(0, t_n)], i.e. on average half a period of accrued premium is paid on default.

The present value of the protection leg¹⁶ is ProtectionPV = (1 − R) ∫_0^T Z(t) (−dQ(0, t)); we are assuming that the payment is made as soon as the notification is sent. The formula's idea is to compute the probability of the payment in each (infinitesimal) time interval; it is simplified with a discrete proxy, ProtectionPV ≈ (1 − R) Σ_{m=1}^{M} Z(t_m) [Q(0, t_{m−1}) − Q(0, t_m)], where twelve nodes per year is a good proxy¹⁷. The CDS breakeven spread is then S = ProtectionPV / RiskyPV01. This system has more unknowns (the survival probabilities) than equations, so we use an iterative process (a bootstrapping approach) from the shortest maturity to the longer ones.

[bootstrapping: number of variables]
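
A simplified sketch of the breakeven relation under a flat hazard rate and a flat interest rate (quarterly premiums, monthly protection nodes, no accrual correction; all inputs illustrative). It reproduces S = ProtectionPV / RiskyPV01 and sits close to the rule of thumb S ≈ (1 − R)λ.

```python
import numpy as np

def cds_breakeven_spread(lam, r, R, maturity, prem_per_year=4, prot_per_year=12):
    """Breakeven CDS spread for a flat hazard rate lam, flat rate r, recovery R."""
    # Premium leg: risky PV of 1 unit of spread paid on each premium date.
    t_prem = np.arange(1, int(maturity * prem_per_year) + 1) / prem_per_year
    dcf = 1.0 / prem_per_year                      # day-count fraction (approx.)
    Q = np.exp(-lam * t_prem)                      # survival probabilities
    Z = np.exp(-r * t_prem)                        # discount factors
    risky_pv01 = np.sum(dcf * Z * Q)

    # Protection leg: (1 - R) paid at the (discretised) default time.
    t_prot = np.arange(1, int(maturity * prot_per_year) + 1) / prot_per_year
    Q_prot = np.exp(-lam * t_prot)
    dQ = np.concatenate(([1.0], Q_prot[:-1])) - Q_prot  # default prob. per interval
    protection_pv = (1.0 - R) * np.sum(np.exp(-r * t_prot) * dQ)

    return protection_pv / risky_pv01

lam, R = 0.02, 0.40
print(cds_breakeven_spread(lam, r=0.03, R=R, maturity=5) * 1e4, "bp")
print((1 - R) * lam * 1e4, "bp  (rule of thumb)")
```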

The RR is estimated with two possible approaches: using ratings or bond prices. The first has limits related to the rules used to compute ratings (American bias, different definitions of default). The second extracts information from bond prices; however it does not work well for good names, where the effect of the RR is negligible.

The intensity-based models consist in modelling the probability of the credit event itself as a jump process, such as a Poisson counting process or a more sophisticated one, where the credit event is the jump occurrence. This class is elegant and very easy to calibrate; however it gives no clue about what is driving the default, so we cannot know how far the company is from the credit event. Calling λ(t) the intensity, we can assume a Cox process to model how it varies as new (continuous) information arrives from the market, so that the survival probability is Q(0, T) = E[exp(−∫_0^T λ(s) ds)]; alternatively we can use time-varying deterministic intensities, where λ is made piecewise constant and the survival probability is Q(0, T) = exp(−Σ_i λ_i Δt_i); or lastly we can assume that λ is constant through time and independent of RR and interest rates, i.e. the hazard rate term structure is flat for all maturities, so that Q(0, T) = e^{−λT}. Once we have decided the term structure of λ we can compute the CDS spread; conversely, from market CDS quotes we can bootstrap a term structure of hazard rates and survival probabilities.

The two authors used a Poisson counting process to predict the credit event occurring at time τ, P(τ ≤ t + dt | τ > t) = λ(t) dt, so the probability measure depends on a time-dependent function (the hazard rate) and on the time interval. The hazard rates are usually assumed to be deterministic. The risk-neutral hazard rates are higher than the historical ones, since the first account for more factors (liquidity, for instance). Note that for an inverted curve it is possible to obtain negative hazard rates; this may signal an arbitrage (model-dependent or model-independent).

To compute the hazard rates, our model needs the interest rate term structure and a sound estimate of the RR; we also assume continuous functions, discretized to ease the math.

The structural models are based on the general idea that the firm value follows a stochastic process and default occurs when this value falls below a certain threshold. This class is intuitive and linked to tractable quantities, however we need accounting data¹⁸. The simplest example is the Merton model (1974), where the CDS spread is the excess of the effective return on the risky debt over the risk-free rate.


¹⁶ We assume that the number of possible credit event dates is finite, in this case one per month.
¹⁷ The approximation error is of order r/(2M), assuming a flat structure, where M is the number of nodes per year.
¹⁸ Recent papers show that it is not consistent to rely on balance-sheet data, since they are noisy measures and may not reflect real transactions and values. Those papers show a superior information content of rating agency outcomes.
The passage to arrive at this relation is to write the company value as the sum of equity (which is described as a call on the firm value) and the debt position, which is worth D e^{−rT} minus a put on the firm value with strike D, where "r" is the risk-free rate since the position is hedged by the short put. Economic principles tell us that the risk-free position made by adding the short put equals the risky present value of the notional debt amount; exploiting this, we compute the risky return as a function of the previous findings. Note that the asset volatility is implied from the stock market value (inverting the BL formula).
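
A sketch of the Merton logic with illustrative inputs: equity is a call on the firm value, risky debt is the risk-free bond minus a put, and the spread is backed out from the risky debt value (in practice the asset volatility would be implied from the equity market, as noted above).

```python
import numpy as np
from scipy import stats

def merton_spread(V, D, r, sigma_V, T):
    """Credit spread implied by the Merton model.
    V: firm value, D: face value of debt, sigma_V: asset volatility."""
    d1 = (np.log(V / D) + (r + 0.5 * sigma_V**2) * T) / (sigma_V * np.sqrt(T))
    d2 = d1 - sigma_V * np.sqrt(T)
    put = D * np.exp(-r * T) * stats.norm.cdf(-d2) - V * stats.norm.cdf(-d1)
    risky_debt = D * np.exp(-r * T) - put        # risk-free bond minus the put
    y = -np.log(risky_debt / D) / T              # continuously compounded risky yield
    return y - r                                 # spread over the risk-free rate

# Illustrative inputs only.
print(merton_spread(V=120.0, D=100.0, r=0.03, sigma_V=0.25, T=5.0) * 1e4, "bp")
```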

This model is over-simplistic and in fact has a poor fit, especially for investment-grade bonds or low-spread periods, so academia has come up with many possible extensions. Structural models allow us to find a "fair CDS" spread to be compared with market data, so as to start possible trading strategies; these strategies are called relative value.

CDS typologies and CDO financial product:
A basket or index CDS is built by considering the most liquid names and dividing them into investment-grade buckets (for example the iTraxx 125 is a proxy of the best 125 European and Asian companies; in North America and emerging markets we have the Dow Jones CDX).

These indices are highly standardized: every six months a new series is launched (20 March and 20 September), hence on-the-run indices are the most liquid ones (note that an index continues to exist until all the companies in the basket have defaulted); the indices are divided by sub-sector activity and credit worthiness (investment grade, crossover and high yield [typically the lowest grade considered is B]). If one company defaults, the buyer continues to pay the premium on the remaining notional and receives the protection payment on the defaulted notional. Each reference entity is equally weighted, hence the premium paid is close to an equally weighted average of the single-name CDS spreads.

Payments are settled every quarter. The difference between the index spread and the single-name average is called the index skew and it is due to different liquidity in the instruments, the index having the higher one. Indices are more used since they offer lower bid-ask spreads, diversification and standardization. Correlation does not play any role here, since premium and protection payments depend solely on each underlying asset.

CDO securities are bonds issued on a basket of other assets which benefit from the cascade or tranching credit enhancement; to price these securities we need to define a loss function. Related to the previous point are the definitions of the tranche loss and the tranche default probability: the first is built on the cumulative loss function L(t) = Σ_i (1 − R_i) N_i 1{τ_i ≤ t}, and the second on the loss absorbed by a tranche with attachment A and detachment D, L_{A,D}(t) = min(max(L(t) − A, 0), D − A). Basket securities may have an n-th-to-default optionality, where the protection seller is liable to pay only when the n-th reference entity has defaulted. The CDO tranche is then priced like a CDS, by equating the expected discounted premium and protection legs computed on the tranche loss.

We can extract the implied correlation used to price a CDO; a one-factor model with the LHP (large homogeneous portfolio) approach is typically suggested. To bootstrap the implied correlation there are two conventions: compound and base correlation.

The first assumes a flat correlation that reprices each tranche to fit its market value (we can compare with historical data); the result shows a smile, with higher levels for the equity (lower implied correlation than the historical one, but high sensitivity) and senior tranches, while mezzanine tranches are cheap in correlation terms due to the higher demand. The second consists in setting the base tranche equal to the equity one; the others are bootstrapped as long combinations of equity tranches, and the base correlation in this model is an increasing function of the detachment point (the equity holder is in a "winner takes all" position, while the senior bears the "hazard", i.e. clustering, risk).
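
A Monte Carlo sketch of the one-factor Gaussian copula behind this discussion, for a homogeneous pool: it produces the tranche expected loss for given attachment/detachment points (pool size, default probability, recovery and correlation below are illustrative).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)

def tranche_expected_loss(attach, detach, p_default, rho, recovery,
                          n_names=125, n_sims=100_000):
    """Expected loss of the tranche [attach, detach] as a fraction of its width,
    under a one-factor Gaussian copula with a homogeneous pool."""
    c = stats.norm.ppf(p_default)                     # default threshold
    M = rng.standard_normal((n_sims, 1))              # common factor
    eps = rng.standard_normal((n_sims, n_names))      # idiosyncratic factors
    X = np.sqrt(rho) * M + np.sqrt(1.0 - rho) * eps   # latent variables
    defaults = (X < c).mean(axis=1)                   # fraction of names defaulted
    loss = (1.0 - recovery) * defaults                # portfolio loss fraction
    tranche_loss = np.clip(loss - attach, 0.0, detach - attach) / (detach - attach)
    return tranche_loss.mean()

for attach, detach in [(0.00, 0.03), (0.03, 0.06), (0.06, 0.09)]:
    el = tranche_expected_loss(attach, detach, p_default=0.05, rho=0.3, recovery=0.4)
    print(f"tranche {attach:.0%}-{detach:.0%}: expected loss {el:.2%}")
```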

A CDO is a security backed by a pool of different debt obligations (loans, bonds, other CDOs and other structured products).

[add the process of origination]

[add the tranching process and benefits]
CDOs are sold to remove assets from the balance sheet, to refinance the originator, and to exploit the arbitrage granted by the tranche pricing system, i.e. the creation of value by credit risk transformation.

To price CDO tranches we need to define: the attachment points¹⁹, the tranche width, the portfolio credit quality, the RR, the maturity and the default correlation. From those elements we need to derive the loss distribution, whose features are: highly skewed (the more so, the lower the correlation), monotonically decreasing when the correlation increases, and U-shaped the higher the correlation.


Sensitivities:
A single-name CDS is affected by:
- changes in the CDS spread
- the RR estimate hypothesis
- counterparty risk

A more structured product is also affected by the correlation between reference entities and by changes in CDS spreads, both at the index and at the single-name level.

Hedging:
At first we need to decide whether to use index or single-name CDS; then we need to compute the delta hedge ratio (numerically defined as the ratio between the percentage change of the product and that of its underlying, given a change in the CDS spread).

Each tranche has a specific hedge ratio; it is higher for equity tranches and lower for senior ones, since most of the likely losses are absorbed by the junior tranches. Furthermore the cost depends on the CDS used (the index is by far more costly); single-name CDS are more costly for the equity tranche if the reference entity is riskier, and the reverse holds for the senior. The economic rationale is that an increase of the CDS spread for good-quality entities signals an overall deterioration of the backing securities, making a loss for the senior tranche more probable: basically all the entities would default together.

The second-derivative sensitivity to the CDS spread is negative for equity and positive for senior tranches, while for mezzanine it depends on the CDO capital structure. The economic rationale is that senior tranches are more sensitive to a deterioration of credit quality, since only then do they start to be affected.

These hedging methods highly depend on the model assumptions used to value the position and the sensitivities, making them quite unstable and costly (dynamic rebalancing is frequent due to the instability of the estimates); moreover the market still does not provide a consensus on the most appropriate model. Hence we must also account for model and misspecification risk.

The sensitivity to correlation (the change in MTM for a 1% change in correlation) is related to the gamma one: equity tranches are long correlation (winner takes all), senior tranches are negative (a riskier scenario in case of default clustering), and mezzanine tranches are typically negative.




¹⁹ That is, the lower and upper bounds of the loss for each given tranche.
 

Mehr von Giulio Laudani

Coding and Equity pricing
Coding and Equity pricingCoding and Equity pricing
Coding and Equity pricingGiulio Laudani
 
Fixed income: Swaption, Futures, IRS, rate model
Fixed income: Swaption, Futures, IRS, rate modelFixed income: Swaption, Futures, IRS, rate model
Fixed income: Swaption, Futures, IRS, rate modelGiulio Laudani
 
Investement banking: M&A, Distressed asset
Investement banking: M&A, Distressed assetInvestement banking: M&A, Distressed asset
Investement banking: M&A, Distressed assetGiulio Laudani
 
Risk Management: Bank and Insurance
Risk Management: Bank and InsuranceRisk Management: Bank and Insurance
Risk Management: Bank and InsuranceGiulio Laudani
 
Bailout.consulting.brevini
Bailout.consulting.breviniBailout.consulting.brevini
Bailout.consulting.breviniGiulio Laudani
 

Mehr von Giulio Laudani (8)

Coding and Equity pricing
Coding and Equity pricingCoding and Equity pricing
Coding and Equity pricing
 
Fixed income: Swaption, Futures, IRS, rate model
Fixed income: Swaption, Futures, IRS, rate modelFixed income: Swaption, Futures, IRS, rate model
Fixed income: Swaption, Futures, IRS, rate model
 
Econometrics: Advance
Econometrics: AdvanceEconometrics: Advance
Econometrics: Advance
 
Investement banking: M&A, Distressed asset
Investement banking: M&A, Distressed assetInvestement banking: M&A, Distressed asset
Investement banking: M&A, Distressed asset
 
Risk Management: Bank and Insurance
Risk Management: Bank and InsuranceRisk Management: Bank and Insurance
Risk Management: Bank and Insurance
 
Financial accounting
Financial accountingFinancial accounting
Financial accounting
 
Bailout.consulting.brevini
Bailout.consulting.breviniBailout.consulting.brevini
Bailout.consulting.brevini
 
Tesi giulio laudani
Tesi giulio laudaniTesi giulio laudani
Tesi giulio laudani
 

Kürzlich hochgeladen

The Economic History of the U.S. Lecture 30.pdf
The Economic History of the U.S. Lecture 30.pdfThe Economic History of the U.S. Lecture 30.pdf
The Economic History of the U.S. Lecture 30.pdfGale Pooley
 
High Class Call Girls Nashik Maya 7001305949 Independent Escort Service Nashik
High Class Call Girls Nashik Maya 7001305949 Independent Escort Service NashikHigh Class Call Girls Nashik Maya 7001305949 Independent Escort Service Nashik
High Class Call Girls Nashik Maya 7001305949 Independent Escort Service NashikCall Girls in Nagpur High Profile
 
VVIP Pune Call Girls Katraj (7001035870) Pune Escorts Nearby with Complete Sa...
VVIP Pune Call Girls Katraj (7001035870) Pune Escorts Nearby with Complete Sa...VVIP Pune Call Girls Katraj (7001035870) Pune Escorts Nearby with Complete Sa...
VVIP Pune Call Girls Katraj (7001035870) Pune Escorts Nearby with Complete Sa...Call Girls in Nagpur High Profile
 
Solution Manual for Principles of Corporate Finance 14th Edition by Richard B...
Solution Manual for Principles of Corporate Finance 14th Edition by Richard B...Solution Manual for Principles of Corporate Finance 14th Edition by Richard B...
Solution Manual for Principles of Corporate Finance 14th Edition by Richard B...ssifa0344
 
Best VIP Call Girls Noida Sector 18 Call Me: 8448380779
Best VIP Call Girls Noida Sector 18 Call Me: 8448380779Best VIP Call Girls Noida Sector 18 Call Me: 8448380779
Best VIP Call Girls Noida Sector 18 Call Me: 8448380779Delhi Call girls
 
The Economic History of the U.S. Lecture 20.pdf
The Economic History of the U.S. Lecture 20.pdfThe Economic History of the U.S. Lecture 20.pdf
The Economic History of the U.S. Lecture 20.pdfGale Pooley
 
Instant Issue Debit Cards - High School Spirit
Instant Issue Debit Cards - High School SpiritInstant Issue Debit Cards - High School Spirit
Instant Issue Debit Cards - High School Spiritegoetzinger
 
The Economic History of the U.S. Lecture 22.pdf
The Economic History of the U.S. Lecture 22.pdfThe Economic History of the U.S. Lecture 22.pdf
The Economic History of the U.S. Lecture 22.pdfGale Pooley
 
Call US 📞 9892124323 ✅ Kurla Call Girls In Kurla ( Mumbai ) secure service
Call US 📞 9892124323 ✅ Kurla Call Girls In Kurla ( Mumbai ) secure serviceCall US 📞 9892124323 ✅ Kurla Call Girls In Kurla ( Mumbai ) secure service
Call US 📞 9892124323 ✅ Kurla Call Girls In Kurla ( Mumbai ) secure servicePooja Nehwal
 
The Economic History of the U.S. Lecture 21.pdf
The Economic History of the U.S. Lecture 21.pdfThe Economic History of the U.S. Lecture 21.pdf
The Economic History of the U.S. Lecture 21.pdfGale Pooley
 
The Economic History of the U.S. Lecture 23.pdf
The Economic History of the U.S. Lecture 23.pdfThe Economic History of the U.S. Lecture 23.pdf
The Economic History of the U.S. Lecture 23.pdfGale Pooley
 
Malad Call Girl in Services 9892124323 | ₹,4500 With Room Free Delivery
Malad Call Girl in Services  9892124323 | ₹,4500 With Room Free DeliveryMalad Call Girl in Services  9892124323 | ₹,4500 With Room Free Delivery
Malad Call Girl in Services 9892124323 | ₹,4500 With Room Free DeliveryPooja Nehwal
 
Independent Call Girl Number in Kurla Mumbai📲 Pooja Nehwal 9892124323 💞 Full ...
Independent Call Girl Number in Kurla Mumbai📲 Pooja Nehwal 9892124323 💞 Full ...Independent Call Girl Number in Kurla Mumbai📲 Pooja Nehwal 9892124323 💞 Full ...
Independent Call Girl Number in Kurla Mumbai📲 Pooja Nehwal 9892124323 💞 Full ...Pooja Nehwal
 
High Class Call Girls Nagpur Grishma Call 7001035870 Meet With Nagpur Escorts
High Class Call Girls Nagpur Grishma Call 7001035870 Meet With Nagpur EscortsHigh Class Call Girls Nagpur Grishma Call 7001035870 Meet With Nagpur Escorts
High Class Call Girls Nagpur Grishma Call 7001035870 Meet With Nagpur Escortsranjana rawat
 
Dividend Policy and Dividend Decision Theories.pptx
Dividend Policy and Dividend Decision Theories.pptxDividend Policy and Dividend Decision Theories.pptx
Dividend Policy and Dividend Decision Theories.pptxanshikagoel52
 
(DIYA) Bhumkar Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(DIYA) Bhumkar Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...(DIYA) Bhumkar Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(DIYA) Bhumkar Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...ranjana rawat
 
OAT_RI_Ep19 WeighingTheRisks_Apr24_TheYellowMetal.pptx
OAT_RI_Ep19 WeighingTheRisks_Apr24_TheYellowMetal.pptxOAT_RI_Ep19 WeighingTheRisks_Apr24_TheYellowMetal.pptx
OAT_RI_Ep19 WeighingTheRisks_Apr24_TheYellowMetal.pptxhiddenlevers
 
20240417-Calibre-April-2024-Investor-Presentation.pdf
20240417-Calibre-April-2024-Investor-Presentation.pdf20240417-Calibre-April-2024-Investor-Presentation.pdf
20240417-Calibre-April-2024-Investor-Presentation.pdfAdnet Communications
 
VIP Call Girls Service Dilsukhnagar Hyderabad Call +91-8250192130
VIP Call Girls Service Dilsukhnagar Hyderabad Call +91-8250192130VIP Call Girls Service Dilsukhnagar Hyderabad Call +91-8250192130
VIP Call Girls Service Dilsukhnagar Hyderabad Call +91-8250192130Suhani Kapoor
 

Kürzlich hochgeladen (20)

The Economic History of the U.S. Lecture 30.pdf
The Economic History of the U.S. Lecture 30.pdfThe Economic History of the U.S. Lecture 30.pdf
The Economic History of the U.S. Lecture 30.pdf
 
High Class Call Girls Nashik Maya 7001305949 Independent Escort Service Nashik
High Class Call Girls Nashik Maya 7001305949 Independent Escort Service NashikHigh Class Call Girls Nashik Maya 7001305949 Independent Escort Service Nashik
High Class Call Girls Nashik Maya 7001305949 Independent Escort Service Nashik
 
VVIP Pune Call Girls Katraj (7001035870) Pune Escorts Nearby with Complete Sa...
VVIP Pune Call Girls Katraj (7001035870) Pune Escorts Nearby with Complete Sa...VVIP Pune Call Girls Katraj (7001035870) Pune Escorts Nearby with Complete Sa...
VVIP Pune Call Girls Katraj (7001035870) Pune Escorts Nearby with Complete Sa...
 
Veritas Interim Report 1 January–31 March 2024
Veritas Interim Report 1 January–31 March 2024Veritas Interim Report 1 January–31 March 2024
Veritas Interim Report 1 January–31 March 2024
 
Solution Manual for Principles of Corporate Finance 14th Edition by Richard B...
Solution Manual for Principles of Corporate Finance 14th Edition by Richard B...Solution Manual for Principles of Corporate Finance 14th Edition by Richard B...
Solution Manual for Principles of Corporate Finance 14th Edition by Richard B...
 
Best VIP Call Girls Noida Sector 18 Call Me: 8448380779
Best VIP Call Girls Noida Sector 18 Call Me: 8448380779Best VIP Call Girls Noida Sector 18 Call Me: 8448380779
Best VIP Call Girls Noida Sector 18 Call Me: 8448380779
 
The Economic History of the U.S. Lecture 20.pdf
The Economic History of the U.S. Lecture 20.pdfThe Economic History of the U.S. Lecture 20.pdf
The Economic History of the U.S. Lecture 20.pdf
 
Instant Issue Debit Cards - High School Spirit
Instant Issue Debit Cards - High School SpiritInstant Issue Debit Cards - High School Spirit
Instant Issue Debit Cards - High School Spirit
 
The Economic History of the U.S. Lecture 22.pdf
The Economic History of the U.S. Lecture 22.pdfThe Economic History of the U.S. Lecture 22.pdf
The Economic History of the U.S. Lecture 22.pdf
 
Call US 📞 9892124323 ✅ Kurla Call Girls In Kurla ( Mumbai ) secure service
Call US 📞 9892124323 ✅ Kurla Call Girls In Kurla ( Mumbai ) secure serviceCall US 📞 9892124323 ✅ Kurla Call Girls In Kurla ( Mumbai ) secure service
Call US 📞 9892124323 ✅ Kurla Call Girls In Kurla ( Mumbai ) secure service
 
The Economic History of the U.S. Lecture 21.pdf
The Economic History of the U.S. Lecture 21.pdfThe Economic History of the U.S. Lecture 21.pdf
The Economic History of the U.S. Lecture 21.pdf
 
The Economic History of the U.S. Lecture 23.pdf
The Economic History of the U.S. Lecture 23.pdfThe Economic History of the U.S. Lecture 23.pdf
The Economic History of the U.S. Lecture 23.pdf
 
Malad Call Girl in Services 9892124323 | ₹,4500 With Room Free Delivery
Malad Call Girl in Services  9892124323 | ₹,4500 With Room Free DeliveryMalad Call Girl in Services  9892124323 | ₹,4500 With Room Free Delivery
Malad Call Girl in Services 9892124323 | ₹,4500 With Room Free Delivery
 
Independent Call Girl Number in Kurla Mumbai📲 Pooja Nehwal 9892124323 💞 Full ...
Independent Call Girl Number in Kurla Mumbai📲 Pooja Nehwal 9892124323 💞 Full ...Independent Call Girl Number in Kurla Mumbai📲 Pooja Nehwal 9892124323 💞 Full ...
Independent Call Girl Number in Kurla Mumbai📲 Pooja Nehwal 9892124323 💞 Full ...
 
High Class Call Girls Nagpur Grishma Call 7001035870 Meet With Nagpur Escorts
High Class Call Girls Nagpur Grishma Call 7001035870 Meet With Nagpur EscortsHigh Class Call Girls Nagpur Grishma Call 7001035870 Meet With Nagpur Escorts
High Class Call Girls Nagpur Grishma Call 7001035870 Meet With Nagpur Escorts
 
Dividend Policy and Dividend Decision Theories.pptx
Dividend Policy and Dividend Decision Theories.pptxDividend Policy and Dividend Decision Theories.pptx
Dividend Policy and Dividend Decision Theories.pptx
 
(DIYA) Bhumkar Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(DIYA) Bhumkar Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...(DIYA) Bhumkar Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(DIYA) Bhumkar Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
 
OAT_RI_Ep19 WeighingTheRisks_Apr24_TheYellowMetal.pptx
OAT_RI_Ep19 WeighingTheRisks_Apr24_TheYellowMetal.pptxOAT_RI_Ep19 WeighingTheRisks_Apr24_TheYellowMetal.pptx
OAT_RI_Ep19 WeighingTheRisks_Apr24_TheYellowMetal.pptx
 
20240417-Calibre-April-2024-Investor-Presentation.pdf
20240417-Calibre-April-2024-Investor-Presentation.pdf20240417-Calibre-April-2024-Investor-Presentation.pdf
20240417-Calibre-April-2024-Investor-Presentation.pdf
 
VIP Call Girls Service Dilsukhnagar Hyderabad Call +91-8250192130
VIP Call Girls Service Dilsukhnagar Hyderabad Call +91-8250192130VIP Call Girls Service Dilsukhnagar Hyderabad Call +91-8250192130
VIP Call Girls Service Dilsukhnagar Hyderabad Call +91-8250192130
 

Scheme on modeling dependence between financial assets

  • 1. Giulio Laudani Cod. 20263 Scheme on CDS and Copula Empirical data features: ____________________________________________________________________ 2 Univariate world:________________________________________________________________________________ 2 Multivariate world: ______________________________________________________________________________ 2 Multivariate insight: ______________________________________________________________________ 2 Measure and test _______________________________________________________________________________ 2 Dependence Measure: ____________________________________________________________________________________ 2 Test on distribution: ______________________________________________________________________________________ 4 Gaussian multivariate correlation __________________________________________________________________ 4 Other multivariate specification ____________________________________________________________________ 4 Copula function __________________________________________________________________________ 5 What is a copula e when it is suitable _______________________________________________________________ 5 Sklar’s theorem [1959] ___________________________________________________________________________ 6 How to generate a Copula: ________________________________________________________________________ 6 Type of copula function __________________________________________________________________________ 7 Implicit types: ___________________________________________________________________________________________ 7 Explicit types: ___________________________________________________________________________________________ 7 Meta-distribution:________________________________________________________________________________________ 7 Risk modeling ____________________________________________________________________________ 8 Goals and features: ______________________________________________________________________________ 8 Approaches: ____________________________________________________________________________________ 8 Comments and further details: _____________________________________________________________________ 9 Pricing Multi-asset derivatives _____________________________________________________________ 10 The BL Formula _________________________________________________________________________________________ 10 The Practitioners’ Approach: ______________________________________________________________________________ 10 The copula approach: ____________________________________________________________________________________ 10 Credit Derivatives focus: _________________________________________________________________________ 10 Market practice: ________________________________________________________________________________________ 11 Pricing a CDS: __________________________________________________________________________________________ 11 CDS typologies and CDO financial product: ___________________________________________________________________ 14 1
  • 2. Giulio Laudani Cod. 20263 Empirical data features: We are going to use as an explanatory model of return evolution the one that is assuming the presence of two compo- nent: the permanent information and the temporary noisy one. The second one prevails in the short term high fre- quency observation, while the first emerges for longer horizon. This property implies: In general, predictability of returns increases with the horizon. The best estimate for the mean of returns at high frequency is zero, but a slowly evolving time varying mean of returns at long-horizons could and should be mod- eled. [there is a strong evidence of correlation between industrialized countries stocks] If we run a regression we are expecting high statistical significance for the parameters for log horizon The presence of the noise component in returns causes volatility to be time-varying and persistent, and the an- nualized volatility of returns decrease with horizon Data shows non normality behavior, unconditional distribution with higher tails Non-linearity is also a feature of returns at high frequency: a natural approach to capture nonlinearity is to diffe- rentiate alternative regimes of the world that govern alternative description of dynamics (such as level of volatil- ity in the market), for example Markov chain. Univariate world: Those time series have little serial correlation, high absolute return correlation, excepted conditional return are close to zero, volatility appears to change over time, leptokurtosis and skewness are main features and extremeevent ap- pear in cluster (thus it can predict), lastly long term interval converge more to Gaussian hp Multivariate world: Little correlation across time, except for contemporaneous returns, strong correlation for absolute returns, correla- tion[to compute the correlation we should fit different models for changing correlation and make a statistical compari- son] vary over time (and clustering effect), extreme returns in one series are correlated with other returns (extreme dependence) Multivariate insight: This topic is relevant for pricing. Asset allocation and risk management issues. When we dealt with multi securities we need to model not only the dynamics of each securities, but also the joint evolution, i.e. their dependence structure. Before starting to describe the first approach we need to provide some notation that will be used here after: Consider a d-dimensional Vector X the joint distribution is The marginal distribution will be If the [marginal distribution] is continuous we can obtain the densities by computing the first derivatives, so the joint densities will be the non-negative function Note that the existence of a joint density implies the existence of all the marginal densities but not vice-versa, unless the components are orthogonal one to the other Measureand test Dependence Measure: The Linear correlation is the natural dependence measure only for multivariate normal (it will coincide with the maxi- mum likelihood estimators, hence it will satisfied all the desirable properties) and when the variance is (must be) a finite number (not obvious, insurance business). It depends on marginal distribution (since it is computed with moments, 2
  • 3. Giulio Laudani Cod. 20263 Hoffding formula )), hence it is not invariant under more general transformation (it is still invariant under liner transformation) and it wouldn’t capture the correlation dynamic under more generic func- tion. The Rank Correlations are dependence measure that depend only on the copula and not on marginal, so it is invariant under general increasing transformations. To compute the rank correlations we need to know the ordering1 of each va- riable, not the actual numerical value ( we are assuming continuous margins for simplicity) : there exist two measures: Kendall’s tau, we compute the number of “c” concordant and “d” discordant pair between two variables with n variables. sample version. The population version simply consider the probability of concordance minus the probability of discordance , where the two are vectors with iid distribution Spearman’s rho [sample version], we need to compute the rank variables (the same time series are ordered) of the two variables and then compute the correlation between these new variables . The popula- tion version is the correlation of the marginal value Here in the following we will provide a proof on the dependence of the estimators only on copula distribution function: The Kendall’s rho , since the sum of the two is one, we can write the one as a function of the other . We know that since the two variable are i.i.d. this last term is the joint distribution, which can be expressed as an integral form thanks to the Sklars’ theorem we can rewrite the previously equation as function of copula distribution (by change the integral interval with 0,1). hence we have proven that the estimator depends on . The Sperman’stao population estimators is . The covariance formula is , hence we can rewrite it as a function of copula . The so if we put together all this findings we ends up with There exists also a multivariate version of the rank correlation measure, where instead of taking the expected value (population version) we will take the “cov” function or correlation matrix depending if we were using the Kendall or Spearman measure, hence the estimator will be at least semi positive definite. Both the measure can assume the value in the interval [-1,1] and 0 is independent. The tail dependence measure has been set to judge the extreme dependence between pair of variable (it is hard to ge- neralize to the d-dimensional case), the idea behind is to limit conditional probabilities of quantileexecedances. The up- per measure is given by and the value are in the interval [0,1], the opposite for the lower tail . Those formula can be written s function of copula (this measure depends only on copula, hence it is a proper dependence measure), by applying the Bayes theorem 1 When we refer to the ordering of a variable we need to introduce the concept of Concordance and Discordance. In the first case it is meaningful to say that one pair of observations is bigger than the other (without use the Probability distribution) 3
  • 4. Giulio Laudani Cod. 20263 Some example are provided: the following Gaussian is asymptotically independent in both tails, t-copula has a symme- tric (thus not good for real data features) dependence , where higher value of v degree of free- dom will bring an higher tail dependence. The Gumbel copula has upper tail dependence , and the Clayton copula has lower tail dependence is Test on distribution: To test univariate distributionhp we can use the QQ plot technique to have a graphical sense of the validity of the hp. To have a numerical sense we can use the Jarque-Bera test which is based on the joint check of the skewness and kurto- sis of the sample. To test multivariate distribution we need some other tools, specifically set up. The first is based on a quadratic form where we will compute for each observation the , which should be distributed according to a chi-square with d degree of freedom , which can be seen graphically with a QQ plot or numerically with the Marda Test(by using the D, third and fourth moments, the first distribute as a chi-square with degree of freedom [ ], the second follow a standard Gaussian) which is basically a ri-proposition of the jarque-Bera idea, however it is not a joint test, since we test the skewness and kurtosis separately. Gaussian multivariate correlation The first and most trivial measure of dependence is the correlation or equivalently the variance covariance matrix. It is at least a semi-definite matrix, and if it is positive-definite it is possible to apply the Cholesky decomposition. It is usually computed with the sampling estimators2. The Gaussian distribution is the most simple hypothesis, it is characterized by the first two moments, which will entirely describe the dependence structure, linear combination remains normally distributed and it is easy to aggregate different distributions (we just need to know the correlation between them). There exist several closed formula under the geometric Brownian motion in the derivatives pricing and it is simple to manage the risk by using the VaR approach.The Gaussian world is unfeasible with the empirical evidence, it has a poor fit with real data3, hence we need to develop a new set of model which allow for fat-tail and more flexibility in defining the dependence structure. Note that a multivariate set of variable to be jointly Gaussian, each of the univariate distribu- tion must be Gaussian if self. Other multivariate specification The first attempt to solve the problem was to work with the conditional distribution, such as GARCH model , where we are hoping that the rescaled return by volatility is normal. Even if those model performs better than the clas- sical one, we still need something more complete model, furthermore it is a really demanding specification for multiva- riate needs4. 
The second attempt was to use different distribution with fatter tails, such as the t-Student’s 2 We are implicitly assuming the each observation is independent and identically distributed, in fact only in this case this estimator will coincide with the maximum estimators and have all the desired properties, furthermore this estimator will depend on the true multivariate distribution 3 We observe fat tail both in the marginal distribution and in the joint distribution, furthermore the volatility seems to move in clus- ter, while return are less time dependent, and correlation vary over time 4 We need elements for each securities in the matrix to be modeled, where d is the number of coefficient in the algorithms specification. The same problem for the ETV 4
  • 5. Giulio Laudani Cod. 20263 tion5(note that each marginal distribution must be t-student with same characteristic) or more advance parametric spe- cification: Normal Inverse or Levy process. The third possibility was to model just the extreme distribution behavior EVT, which is very popular and useful in the risk metrics word, less when we need to price (in those cases we need the whole distribution) A totally different approach to the problem was to the dimension reduction techniques, the general idea is based on the empirical observation that a limited number of common factors explain most of the dynamics. The simplest way to achieve such a solution is the PCA (here the Cholesky decomposition plays the role). To conclude all of those proposal model a joint distribution without any insight on the marginal distribution (or a least do not allow them to be different form the one accepted by the joint distribution chosen), hence it isn’t a flexible ap- proach. It will by far more interesting to try to put together different marginal distribution which will properly fit the empirical data and then set the joint distribution, like in a building up approach; this is the Copula method which will be treated in the following section. Copula function This approach is preferable for its higher flexibility: the bottom-approach allows to independently control/specify an ap- propriate distribution for each factor and the joint dependence structure. Other important properties are independent to scaling effect since it works on quantile, it is invariant to strictly positive transformation. What is a copula e when it is suitable A d-dimensional copula is a distribution function on the unit hypercube with standard uniform marginal distribu- tions. We denote copulas as and it must follows those properties (otherwise we cannot specify the copula): 1. is increasing in each component.to be proven you have to perform the first derivatives 2. If at least one component u=0 the Copula function must be equal 0 3. meaning that if each other marginal function is realized the copula function result will the i- esimomarginal distribution. To prove, change the other variable with 1 and let the variable to change. 4. The rectangle inequality for all with o In a bivariate case it simply means that for every a1 < b1 and a2 < b2 we have: C(a1, a2) − C(a1, b2) − C(b1, a2) + C(b1, b2) > 0 Properties (1) and (3) are required to have a multivariate distribution function, and to ensure that the marginal distribu- tions are uniform. Property (4) denotes that the copula function is d-increasing. If a function C fulfills these properties, it is a copula6. 5 Since it is an elliptical distribution it enjoys most of the Gaussian properties. The most important which is shared among all the el- liptical distribution family is that the normalize value is distributed v= degree of freedom and d= # securities, i.e. depends only those two values. Furthermore it is closed under convolution (if two variables has the same dispersion matrix we can add them to obtain a new distribution, by assuming independency) Note that to fit fat tail distribution we may ends up with few degree of freedom 66 Also, for 2 6 k < d, the k-dimensional margins are themselves copulas. 5
  • 6. Giulio Laudani Cod. 20263 Sklar’s theorem [1959] This famous and import theorem states that Copula can be extracted from a joint distribution with known margins and that the copula uniqueness is granted if marginal distribution are continuous, vice versathe space of existence of the copula is the multiplication of the marginal distribution range. The proof relies on the generalized inverse theorem. The first part of the proof is, by assuming the existence of the joint distribution function with margins , then exist a Copula , where (by assuming continuous margin), by substituting we have . The second part given a Copula (unique) with given margins, we want to prove the existence of joint distribution with the previously defined margins. 1. We take the vector U with the same distribution of the Copula 2. Then we compute the variable X defined as , where the F are the inverse marginal distri- bution (used to build the Copula). 3. We then write the distribution function of the variable X this last term shows the equivalency of the distribution X (joint distribution) to the Copula, if the margin are assumed to continuous. Another important result of the theorem is that Copula could be extracted by multivariate joint tion .Copula is bounded between the Fréchet limit: countermonotocity ( ) [it is a copula only if d<2] and comonotocity , which are two copula function itself. How to generate a Copula: Structural model proposes as correlation the asset correlation, basically the equity correlation. Intensity based approach propose to use for a given time horizon , unfortunately not enough data available for joint obser- vation. Thus has been proposed a MC algorithm called Li’s model (2000), where by using the hazard rate bootstrapped from CDS term structure and a given correlation taken from asset correlation, however it is computationally intense since default events are rare. If we start form a given copula distribution (the first example will be a Gaussian one) We need to decompose via Cho- lesky decomposition the correlation matrix (we need to specify a procedure to produce this matrix). We need to run d-random variable Y and multiply them with A to obtain the variable Z (where A will contribute to give the corre- lation structure) we will use this variable Z to compute our distribution given a chosen Gaussian copula. To run a t- copula we need to add a step, basically we need to multiply the variable Z by where v are the degree of free- dom and s is a random variable generated by a chi-square distribution with “v” degree. How to fit a copula on empirical data: A different problems to fit empirical data into a proper copula dependence structure specification, there exist several methodologies available: Full maximum likelihood where we will estimates both the copula and margins parameter all together, this me- thod is of course the more accurate and unbiased, however it is tremendously intense. IFM or inference function for margins, where we specify a parameter marginal distribution, which will be used to compute the cumulative probability of the empirical data set, then we will used those probability to run a 6
  • 7. Giulio Laudani Cod. 20263 maximum likelihood to estimate the copula parameters. This method highlydepends on the marginsspecifica- tion, hence it is quite instable Canonical maximum likelihood is similar to the previous one, however we are going to use empirical marginal distribution, we are neutral on this side, while we are more focus on the joint dynamics Method of moments (used by MatLab), basically we are going to use the empirical estimates for some rank cor- relation measure, which is assumed to be our lead indicator/driver Type of copula function All those copula function can be expressed in term of survival probability, they are named survival copula. An index of the flexibility of the copula dependence structure is the number of parameter in the algorithms, the most flexible pre- sented is the t-copula (degree of freedom and the correlation) Implicit types: In this class of copula belongs all those classes which do not have an explicit formula. The Gaussian [], where the inde- pendent and the comonoticity , defined only for the bivariate caseare special case of the Gaus- sian for correlation value equal to 0 or 1. It depends on the correlation variable only, which is a matrix. The t-student is another example. Note that in this case we do not require the margin to follow a specified distribution Explicit types: Closed formula are available for this class of copula. The Fréchet copula is the combination of the three fundamental copula: independent and the two limits (each of those is an explicit copula). It is a sort of average with coefficients β, α.We can obtain copula as a linear combination of this fundamental copula as follow 1,…+ ( 1,…), the beta and Alfa describe the dependence structure Belonging to the Archimedean class we have:Gumbel Copula defined by the equation Where the variable is the only parameter (not really flexible), it is bounded between 1 and infinite. At extreme it converge to the independent and to comonotonicity copula. Clayton Copuladefined by the equation Where the variable is the only parameter (not really flexible), it is bounded between 0 and infinite. At extreme it converge to comonotonicity the and to independent copula Meta-distribution: In this case we not directly apply the Skalar’s theorem. At first we will generate the random variable U (cumulative probability) form a specified copula function (which is the one that we want to use). We will use those generate va- riables into a specified marginal inverse distribution to generate our random variables. Basically we have run the apply the inverse procedure form the top to the bottom An example is the Li’s model where Gaussian copula is used to joint together exponential margin to obtain a model for the default times of copula when these default time are considered to be correlated. 7
  • 8. Giulio Laudani Cod. 20263 Risk modeling Goals and features: As a risk manager we are interested in the loss distribution 7function defined (in general) as where the randomness come into with the variable “t”, basically the loss is function of time. Generally speaking we are interest in the function which defined the capital absorbed (regulators). Those numbers may be used for: Management tool, capital requirement and performance measuring adjusted by risk. to perform those tasks we need a model to predict asset value evolution, the most used one is a factors model mapped on specific securities’ features. However this approach has the limit to assume a constant composition of the portfolio and the prediction is made by a linearization approximation, which might work only for small factor changes and a negligible second derivatives effect. Risk management may use unconditional or conditional distribution, where the later are preferable to assess a dynamic environment, where new information come into through time. The unconditional distribution are used for time interval greater than four months, since the time dependence of the return start to be negligible. It is a crucial task to under- stand the data, needs and tools available in the daily RM routine. Approaches: Notional is the most used by regulators, since is very simple. The general idea is to weight the entity asset by specific risk level. This method has severe limits since there is no diversification or netting benefits and to value derivatives position it is wrong to account the national value since the transaction is anchor to a specific payoff. Factors model follows a delta-gamma approach. It is not comparable across different asset class, it doesn’t allow to con- sider overall risk (you need to model the correlation of each factors, no sense in put together delta and Vega effect), however it performs well when judging well specified event The most used Loss distributionmethod is the VaR approach.For a given portfolio, confidence level and time horizon, VaR is defined as a threshold value such that the probability that the mark-to-market loss on the portfolio over the given time horizon exceeds this value (assuming normal markets and no trading in the portfolio) is the given probability level. The limits are: we do not have a clear and fully agreed parameters calibration 8confidence level an time horizon, that’s why the regulators force you to use a one day @ 0.99 parameters), it is not always sub additive, this problem get more and more severe when we dealt with non-elliptical distribution, it didn’t give us any insight on the tail (after) possible worst outcome loss The ES will overcome those issues (It is a coherent risk measure), however is tough to be computed, one solution is to slice the tail distribution into equal probability occurrence, compute the VaR and make the mean, however for non- parametric distribution the observation might be two few. This method can be inferred from parametric, historical and MC distribution, however they are intensive on the computational side. Market risk Into the credit risk definition we have the Credit risk itself, migration risk and o Riskmetrics, where we will generate a whole migration matrix to be used 7 Our interest in loss is related both to the job performed by risk management, and to the relevance and well defined academic modeling 8
  • 9. Giulio Laudani Cod. 20263 o Credit view idea is to use macroeconomic prospective to correct the risk measurement, note that if the time horizon used is a point in time there is no need to adjust by economic cycle, in this case there would be a double counting Scenario based approach suffer of misspecification, it is good only for well-defined problem (factors) and it is not easy to compare with multi asset securities. Variance was used as a risk measure, however it highly depends on the hp distribution (only symmetric, since deviation are equally weighted), should exist. Possible solution are partial moments, which is the base idea of expected shortfall. Comments and further details: Backtesting is really important. The idea is to check how many time the measure fail over time, it is a pointless/daunting task to run Backtesting on ES, since errors on VaR are few and average reduce the effect. The scaling problem is undertake with overlapping or with separated time interval. The first will allow to do not lose da- ta point, but it introduces serial correlation on the data, the second reduce the data that can be used. The convention to rescaling by multiplying with a time factor is correct only if the data are iid, that is not the case The stress test must always be undertaken by risk manager this technique is a perfect complementary tool to track the portfolio risk dynamics. Any risk measure should fit the coherent risk measureproperties: Some comments on the major characteristic of credit models: Default-mode versus multinomial one, where only credit risk + belongs to the first Future values vs. loss rate, meaning that the model can be based on the distribution of possible value or possi- ble future loss, the first use as input the spread curve by maturity, while in the second the spread it is not neces- sary to be known. Creditmetrics is a typical market value model, while credit risk + loss one Conditional vs. un-Conditional, portfolio views belong to the first, however this distinction is useful only if the model works through the cycle Monte Carlo vs analytical solution Asset correlation vs. default correlation, it’s less important than other, in fact they are close to each other. Cre- ditmetrics is belongs to asset correlation, while credit + to the second one. 9
  • 10. Giulio Laudani Cod. 20263 Pricing Multi-asset derivatives The finance cornerstone in pricing any derivative is , where our goal is to com- pute a meaningfully and arbitrage free measure of the function f(x). The multi-asset derivatives family is the one where the payoff depends on more than one underlying. CDS by itself is not a multi-asset derivatives, however there exist in the market some example of derivatives protection against default of more than one underlying. The most used multi-asset payoff types are: Best of, Worst of and Basket option. There exist three possible framework to be used to price those derivatives, in the following section there will be a brief description of all of them, however our focus will be set on the Copula one. The BL Formula In this set of hypothesis the motion used is the multivariate geometric Brownian one, where the correlation between asset is modeled into the stochastic (diffusion) component by the following relationship . In this framework we can obtain a closed formula, however the distribution is lognormal and the parameters are assumed to be constant, hence the risk-neutral density for the log returns is normal, not really feasible. The Practitioners’ Approach: It is usually introduce some more sophistication into the pricing formula, such as a stochastic volatility and jump diffu- sion process for individual random variable. At first we need to calibrate the models’ parameter for each asset and then we will estimate the correlation matric via historical data. The last stage is to use a Monte Carlo simulation techniques to compute the price. This approach is really flexible and consistent (Rosenberg states that it is meaningful to mix an objective dependence structure with univariate distribution) with univariate pricing, however the use of the linear correlation estimators is a poor fit to describe dependence structure. The copula approach: Given a copula function we can easily compute the joint density function as where the right term is simplythe joint derivatives of the copula function . This approach is an extension of the previously one, since we are modeling more sophisticated dependence structure via copula. With those information we can recall the pricing formula of an European derivatives with general payoff , by recalling that the joint density is a function of the marginal densities (easily computed form market data) and risk neutral copula (not easy to extract form market data), however it has been proven that historical one is equal to risk neutral one under fairly general hp, which is to use affine transformation. This approach is really flexible, suitable for both parametric and non-specification and furthermore can be extended to add more sophisticated dynamics. Note that we are still assuming a static approach, where the parameters are not let to change over time, and the information derived from historical behavior. Credit Derivatives focus: Within this class of derivatives securities belong any instrument that enables the trading/management of credit risk in isolation from the other risk associated with the underlying. Those instruments are traded over the counter, hence it is very flexible, however they have been standardized in the recent year to enhance the liquidity. 10
  • 11. Giulio Laudani Cod. 20263 They have been used as risk management tools and as a trading tool to synthetically take exposure on creditworthiness, in 2012 the whole CDS market (single and multi-name) has a market value equal to 1.666$ billion, this market account for 72% of the whale derivatives notional amount. Regulators have add new requirement for accounting/capital requirement for those classes of assets, CVA (risk of coun- terpart). The elements of this “new” risk are: it is a highly skewed for protection buyer, the buyer riskiness should take into account (bilateral), the maximum loss is the MTM at default. Possible solutions to hedge this risk are: netting, diver- sifying the counterpart (not feasible for CDS derivatives), posting collateral (illiquid, require time to acquire) or clearing house (pricing of those instruments will account for cash). It may be priced we will use the present value of swaption for each nock (default time) multiplied by the probability of default bootstrapped from CDS. [with structural or reduced ap- proach] this solution is justified by the economic intuition of the kind of risk undertaken by one considering the position; basically a default of the counterpart will cause a loss on a random time in the future of a given amount we can hedge it by buying a swaption so that a random date into the future we will have the option to protect our self against this credit event. This kind of risk is defined as a Wrong risk, since it will always move against. Market practice: CDS is an insurance contract through which one party provides protection (protection seller) against credit risk of sin- gle/multi entities in exchange of periodic premium payments. These premium are paid on a regular basis (normally quarterly) until the credit event occurs or till the CDS expiry8. The spread traded in the market is the annualized premium under the convention act/360, expressed in basis points. The recent convention is to quote at standard spread (100bp for investment and 500bp for junk bond, US convention) (European convention has more than two nocks[]) corrected with upfront payment which varies according to the default risk. In the contract it must be clearly specified which is the credit event considered as a liable at starting date by the protection seller, furthermore it must be clarify if there will be a cash settlement9or a physical settlement10, which is the most common11.Possible events are bankruptcy, failure to pay or any restructuring event.CDS starts usually 3 days after the contract date, spot contract, if longer it will be a forward starting CDS, when the protection seller receive the notifi- cation it has 72 days to deliver the Cheapest to deliver securities Pricing a CDS: CDS market is so liquid that the pricing is led by demand and supply, however there exist two main pricing formulas needed to price more exotic payoff, for unwinding12 existing CDS positions or in general when we need to assess the marked to market value of existing CDS position. To perform those tasks we need to specify a model we cannot simply take the difference between spread, since the credit event is random. The whole set of possible models available to price a CDS need to properly define the default probability of both the reference entity and protection seller, their correlation; any cheapest to delivery option granted, the maturity of CDS and the expectedrecovery rate. 
8 The most liquid maturity is the 5 years, followed by the 3,7 and 10 years 9 The seller pays the buyer the face value of the debt minus the recovery rate calculated on the basis of dealer quotes 10 The buyer delivers any deliverable obligation of the reference entity to the seller in return for the notional amount in cash 11 There exist default digital swap agreement, where the amount paid by the seller is pre-determinate 12 To close an existing position there could be three possible solutions: pay any difference to the originator, sell the position to another counterparty (so the buyer is replaced) and entering in an offsetting transaction with another counterparty 11
  • 12. Giulio Laudani Cod. 20263 The first model is the Asset Swap Approach, which intuition lie on the possibility to replicate the CDS payoff with other trader securities, i.e. an interest swap agreement plus buying/selling the underling bond financed/invested in the repo market. This approach may sound equal to the CDS poor strategy, however the position on the IRS is not automatically cancelled in case of default13, that why there is a basis between the two strategies which is equal zero in the long term (average). Some hedge fund will tray two trade on the two market to make money out of it, by believing in a sort of re- verse behavior in the long run. The second one is the Default probability models family, which is divided into structural models and intensity based/reduced form models14. The first try to estimate the default as a consequence of the reference entity characteris- tic, limits are that the data (usually accounting based) are not really reliable/transparent, it is not enough flexible to fit the market structure (there is no fitting procedure, we should believe that our model deliver the same result of the market (no boundary to constrain the model)) and it is not easy extendable to price CDS. The second focused on model- ing the probability of the credit event itself, those models are extendable, flexible and the most used one (Tumbull & Jarrow, 1995). Those last two models consist on finding into thebreak even CDS spreadequation (latterly defined) the survival probabil- ity. The break even CDS is found by exploiting the relationship between the premium and protection leg, which must be equal at origination. Hereafter there are the formulas used to value the two positions.We need to find out the Market to Market value of the protection and premium legs: The MTMis equal to [if we are long, negative sing the short position[where the last term is the expected value of 1bp premium till default or maturity, whichever is the sooner. To assess the value of the RiskyPV01 we need to understand how we can offset the position: o Ask the original counterpart to close the position or be replaced by other market participant, in this case all the P&L is taken immediately o Create another strategy that allow to receive the spread difference, in this case the P&L is taken during the remaining life, and the payoff is random, since if the default will came before maturity the flow will end. Note that there won’t be any default risk since the deliver if offset The PV01 must be NA and we need to take into account the riskiness/occurrence of each premiums The Model Desirable properties of the model are: capture the credit risk of the reference entity; model the payment as percent of FV, modeling the timing, be flexible ad NA and simple, given the respect of Bid-ASK constrain(model value are within that spread) and by controlling for the time burden. The present value of the premium leg will be PMLPV + PVAP, where where is the contractual CDS spread, is the day count fraction, is the Discount factor and is the arbitrage free survival probability from 0 to the N payment date. 
This formula is ignoring the effect of the premium accrued15 form last premium payment to credit event, to consider this effect the formula will be 13 Furthermore there other factors to be considered: [Fundamental] different payoff, different liquidity, different market participant and possible friction in the repo market; [Technical] cheapest to deliver option, counterparty risk differential, transaction cost to package the asset swap 14 The difference is in how the probability measure in the formula has been computed 15 If the accrued premium is not considered the value of CDS is always higher, this effect depends on the RR and default frequency, if we assumed a flat structure the effect is represent as follow 12
  • 13. Giulio Laudani Cod. 20263 , which is hard to be computed, so we will approximate it with which is an average. The present value of the protection leg16 is , we are assuming that the payment is done as soon as the notification is sent, the formula’s idea is to compute the probability of each payment for any time interval (infinitesimal), it is simplified with a discrete proxy, where twelve nocks per year is a good proxy17.The CDS Breakeven spread will be . This equation have more variables than equation, so we will use an iterated process (bootstrapping approach) form short maturity to longer one. The []bootstrapping number of variables The RR is computed with two possible approaches: use rating or Bond price. The first one has limits related with the rules used to compute rating (American biased, different definition of default).The second is to extract info form bond price, however it is not good for good name, where the effect of the RR is negligible The intensity-based model consists on modeling the probability of credit events itself as a certain jump process, like a Poisson counting process or more sophisticated one, where the credit event is the jump occurrence. This class is elegant, very easy to be calibrated, however there is no clues on what is driving the default, so we cannot know how far the company is from the credit event. By calling with the intensity measure we can assume COX process to model how it varies as new information arrives(continuous) from the market, so that the survival probability will be , or alternatively we can use a time varying deterministic intensities where the lambda is made piecewise constant and the survival probability or lastly by assuming that the lambda is constant through time and independent from RR and interest rates, i.e. the hazard rate term structure is flat for all maturity. When we have decided the term structure of lambda we can compute the CDS spread, on the other hand from the market CDS we can bootstrap a term structure of hazard rates and survival probabilities. The two authors used a Poisson counting process to predict credit event which occur at time τ , so the probability measure depends on a time dependent function (hazard rate) and time interval. The hazard rate are usually assumed to be deterministic. The hazard rate risk neutral are higher than the historical one, since the first will account for more factors (liquidity for instance). Note for inverted curve it could be a case to obtain negative hazard rate, this may be an arbitrage (model dependent or independent) Our model needs, to compute the hazard rate, the interest rate structure and a sound estimates of RR, we also assumed to deal with continuous function, discretized to ease the math. The structural models are based on the general idea that the firms’ value follows a stochastic process and when this value goes below a certain amount the default occurs. This class is intuitive and linked to tractable value, however we need accounting data18. The most simple example is the Merton model (1974), where the CDS is the overhead of the effective return on the investment and the significant risk free. 16 We were assuming that the number of possible credit event is finite, in this case one per month 17 The proxy is r/2M difference, assuming flat structure 18 Recent papers show that it is not consistent to really upon balance sheet data, since they are dirty measure and may not consider real transaction and data. 
Those papers have also shown a superior information ratio from rating-agency outcomes.
The passage to arrive at this relation is to set the company value as the sum of Equity (which is described as a call on the firm's assets) and the debt position, which is $D_0 = F e^{-rT} - P(V_0, F, T)$, i.e. a risk-free bond minus a put, where "r" is the risk-free rate since we are hedging our position with the short put. Economic principles tell us that the risk-free position made by shorting the put is equal to the risky present value of the notional debt amount; by exploiting this we can compute the risky return as a function of the previous findings. Note that the asset volatility is the value implied by the stock market value (reversing the BL formula).

This model is over-simplistic: in fact it has a poor fit, especially for investment-grade bonds or low-spread periods, and academia has come up with many possible extensions. The structural models allow us to find a "fair CDS" spread to be compared with market data, so as to start possible trading strategies; the name of those strategies is relative value.

CDS typologies and CDO financial product:

A basket or index CDS is built by considering the most liquid securities and dividing them into credit-quality buckets (for example the iTraxx 125 is a proxy of the 125 best European and Asian companies; in North America and emerging markets we have the Dow Jones CDX). Those indices are highly standardized: every six months a new index is launched (20 March and 20 September), hence the on-the-run indices are the most liquid ones (note that an index will continue to exist until all the companies in the basket have defaulted); the indices are divided by sub-sector of activity and credit worthiness (investment grade, crossover and high yield [typically the lowest grade considered is B]). If one company defaults, the protection buyer continues to pay the premium on the remaining notional and receives the protection payment on the defaulted name's notional. Each reference entity is equally weighted, hence the premium paid is close to an equally weighted average of the single-name CDS spreads. Payments are settled every quarter, and the difference between the index spread and the single-name average is called the index skew; it is due to the different liquidity of the instruments, the index having the higher one. Indices are more widely used since they offer lower bid-ask spreads, diversification and standardization. Correlation plays no role here, since premiums and payments depend solely on each underlying asset.

The CDO securities are bonds issued on a basket of other assets which benefit from the cascade, or tranching, credit enhancement; to price these securities we need to define a loss function. Related to the previous point are the definitions of the cumulative portfolio loss, $L(t) = \sum_i (1 - RR_i)\,N_i\,\mathbf{1}_{\{\tau_i \le t\}}$, and of the tranche default probability, which for a tranche with attachment point $A$ is $P(L(t) > A)$. Basket securities may have an n-th-to-default optionality, where the protection seller is liable to pay only when the n-th reference entity has defaulted. The CDO pricing follows from the distribution of $L(t)$. We can extract the implied correlation used to price the CDO; a one-factor model with the LHP (large homogeneous portfolio) approach is suggested []. To bootstrap the implied correlation there are two conventions: compound and base correlation [].

The first assumes a flat correlation that reprices each tranche to fit its market value (we can use historical data as a benchmark); the result shows a smile, with higher levels for equity (lower implied correlation than the historical one, but high sensitivity) and senior tranches, while mezzanine tranches are cheap in correlation terms due to the higher demand.
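A minimal numerical sketch of the Merton decomposition described at the top of this slide, assuming lognormal firm value and hypothetical inputs; `merton_credit_spread` and the example figures are illustrative only:

```python
import numpy as np
from scipy.stats import norm

def merton_credit_spread(V0, F, sigma_V, r, T):
    """Merton (1974): equity is a call on firm value, risky debt is a
    risk-free bond minus a put; the credit spread follows from the put value."""
    d1 = (np.log(V0 / F) + (r + 0.5 * sigma_V**2) * T) / (sigma_V * np.sqrt(T))
    d2 = d1 - sigma_V * np.sqrt(T)
    equity = V0 * norm.cdf(d1) - F * np.exp(-r * T) * norm.cdf(d2)   # call on V
    put = F * np.exp(-r * T) * norm.cdf(-d2) - V0 * norm.cdf(-d1)    # put held short by debt holders
    risky_debt = F * np.exp(-r * T) - put                            # = V0 - equity by put-call parity
    risky_yield = -np.log(risky_debt / F) / T
    return risky_yield - r                                           # spread over the risk-free rate

# hypothetical firm: asset value 120, debt face 100, asset vol 25%, r = 3%, 5y horizon
print(merton_credit_spread(120.0, 100.0, 0.25, 0.03, 5.0))
```

In practice the asset value and volatility are not observed; as noted above, they are backed out from the equity price and equity volatility by reversing the BL formula.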
The second consists of setting the base tranche as the equity one; the others are bootstrapped and are long combinations of equity tranches. The correlation in this model is an increasing function of the detachment point (the equity tranche "winner takes all", while the senior tranches bear the "hazard risk"). A CDO is a security backed by a pool of different debt obligations (loans, bonds, other CDOs and other structured products). [add the process of origination] [add the tranching process and benefits]
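A minimal sketch of the one-factor Gaussian copula with the LHP approximation mentioned above, used to reprice a tranche for a given flat correlation (hypothetical parameters; a base-correlation bootstrap would repeat this calculation on equity tranches [0, K] of increasing width):

```python
import numpy as np
from scipy.stats import norm

def lhp_expected_tranche_loss(attach, detach, p_default, rho, rr=0.40, n_grid=2001):
    """Expected loss of a tranche [attach, detach] under the one-factor Gaussian
    copula with the large-homogeneous-portfolio (LHP) approximation."""
    m = np.linspace(-8, 8, n_grid)                       # common market factor grid
    cond = norm.cdf((norm.ppf(p_default) - np.sqrt(rho) * m) / np.sqrt(1 - rho))
    loss = (1 - rr) * cond                               # conditional portfolio loss fraction
    tranche_loss = np.clip(loss - attach, 0.0, detach - attach)
    return np.trapz(tranche_loss * norm.pdf(m), m) / (detach - attach)

# hypothetical 5y portfolio default probability of 8%, flat correlation 30%
for a, d in [(0.00, 0.03), (0.03, 0.06), (0.06, 0.09)]:
    print(f"tranche {a:.0%}-{d:.0%}: expected loss {lhp_expected_tranche_loss(a, d, 0.08, 0.30):.2%}")
```

Solving for the correlation that reprices a traded tranche to its market value gives the compound (tranche-by-tranche) or base (equity-tranche) implied correlation discussed above.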
CDOs are sold to remove assets from the balance sheet, to refinance the originator and to exploit the arbitrage granted by the tranche pricing system, i.e. the creation of value by credit-risk transformation. To price CDO tranches we need to define: the attachment points19, the tranche width, the portfolio credit quality, the RR, the maturity and the default correlation. From those elements we need to define the loss distribution, whose features are: highly skewed (the more so, the lower the correlation), monotonically decreasing when the correlation increases, and U-shaped (the higher the correlation). []

Sensitivities: a single-name CDS is affected by changes in the CDS spread, by the RR estimate hypothesis and by counterparty risk. A more structured product is affected by the correlation between the reference entities and by changes in the CDS spreads, both of the index and of the single names.

Hedging: at first we need to decide whether we are going to use index or single-name CDS; later we need to compute the delta hedge ratio (numerically defined as the ratio between the percentage change of the product and that of its underlying, given a change in the CDS spread). Each tranche has a specific hedge ratio: it is higher for equity tranches and lower for senior ones, since losses are more likely to be absorbed by the junior tranches. Furthermore, the cost depends on the CDS used (the index is by far more costly); single-name CDS are more costly for the equity tranche if the reference entity is riskier, and the reverse holds for the senior tranche. The economic rationale is that an increase of the CDS spread for good-quality entities is a signal of an overall deterioration of the backing securities, making a loss for the senior tranche more probable, since basically all the entities are going to default together.

The second-derivative sensitivity to the CDS spread is negative for equity and positive for senior tranches, while for mezzanine it depends on the CDO capital structure. The economic rationale is that senior tranches are more sensitive to a deterioration of credit quality, since only then might they be affected. These methods highly depend on the model assumptions used to value the position and its sensitivity, making them quite unstable and costly (dynamic rebalancing is frequent due to the instability of the estimates); moreover the market still does not provide a consensus on the most appropriate model. Hence we must also account for model and misspecification risk.

The sensitivity to correlation (the change in MTM for a 1% change in correlation) is related to the gamma one: equity tranches are long correlation ("winner takes all"), senior tranches are negative (default clustering is the riskier scenario for them), and mezzanine tranches are typically negative.

19 That is, the lower and upper bounds of the loss for each given tranche.
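A minimal sketch of the numerical delta described in the hedging paragraph above, obtained by bumping the portfolio default probability as a crude stand-in for a CDS-spread bump and reusing the hypothetical LHP pricer from the previous sketch (`hedge_ratio` is illustrative, not the market-standard definition based on MTM):

```python
import numpy as np
from scipy.stats import norm

def lhp_tranche_el(attach, detach, p, rho, rr=0.40):
    """Expected tranche loss under the one-factor Gaussian copula / LHP (as before)."""
    m = np.linspace(-8, 8, 2001)
    cond = norm.cdf((norm.ppf(p) - np.sqrt(rho) * m) / np.sqrt(1 - rho))
    tl = np.clip((1 - rr) * cond - attach, 0.0, detach - attach)
    return np.trapz(tl * norm.pdf(m), m) / (detach - attach)

def hedge_ratio(attach, detach, p, rho, bump=1e-4, rr=0.40):
    """Numerical delta: change in tranche expected loss over change in index
    expected loss for a small bump of the portfolio default probability."""
    d_tranche = lhp_tranche_el(attach, detach, p + bump, rho, rr) - lhp_tranche_el(attach, detach, p, rho, rr)
    d_index = (1 - rr) * bump           # index expected loss moves one-for-one with p
    return d_tranche / d_index

for a, d in [(0.00, 0.03), (0.03, 0.06), (0.06, 0.09), (0.09, 0.12)]:
    print(f"tranche {a:.0%}-{d:.0%}: hedge ratio {hedge_ratio(a, d, 0.08, 0.30):.2f}")
```

Consistently with the discussion above, the ratio comes out largest for the equity tranche and declines moving up the capital structure, and it changes with the assumed correlation and spread level, which is why these hedges need frequent rebalancing.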