Advanced Econometrics: Time Series Analysis

The document outlines a lecture on advanced econometrics. It covers univariate and multivariate time series analysis as well as panel data analysis. The outline includes topics like the basics of time series analysis, autoregressive (AR) and moving average (MA) processes, unit roots, choosing models, autoregressive conditional heteroskedasticity (ARCH), and stationarity. The document provides examples and explanations of concepts like white noise, the first-order AR(1) and MA(1) processes, autocovariances, and the autocorrelation function. The lecture is presented by Robert Kunst from the University of Vienna and Institute for Advanced Studies Vienna.

Univariate time series Multivariate time series Panels

Advanced Econometrics
Based on the textbook by Verbeek:
A Guide to Modern Econometrics

Robert M. Kunst
robert.kunst@univie.ac.at

University of Vienna
and
Institute for Advanced Studies Vienna

April 18, 2013

Advanced Econometrics University of Vienna and Institute for Advanced Studies Vienna

Outline

Univariate time series
◮ The basics
◮ General ARMA processes
◮ Unit roots
◮ Choosing a model
◮ ARCH

Multivariate time series

Panels


The basics

Time-series analysis: the idea

Time-series analysis searches data for dynamic structures that may be useful in predicting the future. Besides forecasts, the structures (statistical models) may also reveal features of further interest.

The origin of the observed variable plays a minor role in the analysis. Identical methods can be used on economic and on biological data. Time-series analysis is not a branch of economics, but of statistics.


White noise

Modern time-series analysis assumes that observed sequences of data (time series) are realizations of stochastic processes. A stochastic process (εt ) is called white noise iff
1. Eεt ≡ 0 for all t;
2. var εt ≡ σε² < ∞ for all t;
3. Eεt εt−j = 0 for all j ≠ 0.
White noise is a process with no linear dynamic structure. If data are white noise, this provides a poor prospect for analysis. However, white noise is an important building block in more interesting models.
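The three defining properties can be checked on simulated data. A minimal Python sketch (illustrative, not part of the lecture; `sample_acf` is a helper defined here, not lecture notation):

```python
# Illustrative white-noise check on a simulated Gaussian sample.
import numpy as np

rng = np.random.default_rng(0)
T = 10_000
eps = rng.normal(size=T)            # Gaussian white noise: mean 0, variance 1

sample_mean = eps.mean()
sample_var = eps.var()

def sample_acf(x, j):
    """Sample autocorrelation at lag j."""
    x = x - x.mean()
    return (x[j:] * x[:-j]).sum() / (x ** 2).sum()

# all lags j > 0 should be close to zero (within a few multiples of 1/sqrt(T))
acf_1_to_5 = [sample_acf(eps, j) for j in range(1, 6)]
```

The sample autocorrelations at positive lags should all lie within a few multiples of 1/√T of zero, which is the practical meaning of condition 3.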


The first-order autoregressive process

Assume the generating law

Yt = δ + θYt−1 + εt ,

with δ, θ ∈ R and (εt ) white noise. Then, the process (Yt ) is called
a first-order autoregressive process or AR(1) process.


Removing the mean from an AR(1) process

Assume |θ| < 1. Then, is it possible to obtain an AR(1) process with time-constant mean EYt = EYt−1 = µ? It appears to be so, as this condition and

EYt = E(δ + θYt−1 + εt ) = δ + θE(Yt−1 ) + 0

yield µ = δ/(1 − θ). With the definition yt = Yt − µ, one may also write

yt = θyt−1 + εt

for the centered variable yt .


The variance of an AR(1) process

Assume |θ| < 1. Is it possible to obtain an AR(1) process with time-constant variance varYt = var yt = σY² ? It appears to be so, as this condition and

var yt = var(θyt−1 + εt ) = θ² var(yt−1 ) + σε²

yield σY² = σε²/(1 − θ²). This uses the property that the noise term εt is uncorrelated with the past observation yt−1 , which is a reasonable assumption and, to some researchers, part of the definition of the AR(1).
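Both formulas, µ = δ/(1 − θ) and σY² = σε²/(1 − θ²), can be illustrated on a simulated path. A minimal Python sketch with assumed values δ = 1, θ = 0.6 (not from the slides):

```python
# Illustrative AR(1) simulation; delta, theta, T are assumed values.
import numpy as np

rng = np.random.default_rng(1)
delta, theta, T, burn = 1.0, 0.6, 50_000, 500
eps = rng.normal(size=T + burn)     # white noise with sigma_eps^2 = 1

Y = np.empty(T + burn)
Y[0] = delta / (1 - theta)          # start at the stationary mean
for t in range(1, T + burn):
    Y[t] = delta + theta * Y[t - 1] + eps[t]
Y = Y[burn:]                        # discard the burn-in

mu_theory = delta / (1 - theta)          # = 2.5
var_theory = 1.0 / (1 - theta ** 2)      # sigma_eps^2/(1 - theta^2) = 1.5625
```

The sample mean and sample variance of the simulated path should be close to 2.5 and 1.5625, respectively.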


Autocovariances

The second moments

cov(Yt , Yt−j ) = E(yt yt−j )

are an important characteristic of the joint distribution. If they are time-constant (independent of t but dependent on j), they are called autocovariances γj . Note that γ0 = varYt and that γj = γ−j .


Autocovariances for simple processes


For white noise εt , clearly

γj = 0, j ≠ 0,

and γ0 = σε². For the AR(1) process,

γ1 = E(yt yt−1 ) = E{yt−1 (θyt−1 + εt )} = θE(yt−1²) = θγ0 ,

and, by similar substitution, generally

γk = θ^k γ0 = θ^k σε²/(1 − θ²).

If all second and first moments are time-constant, autocovariances decrease geometrically.

Stationarity

A process (Yt ) is called covariance-stationary iff

EYt = µ ∀t,
varYt = σY² ∀t,
cov(Yt , Yt−k ) = γk ∀t, k.

In short, we will use ‘stationary’ for ‘covariance-stationary’. In some applications, it may be of interest that the entire distribution is time-constant, not only the first two moments (strict stationarity). Clearly, white noise is stationary.


Is the AR(1) process stationary?

1. If θ = ±1, there is no solution to the variance condition; the AR(1) can never be stationary.
2. If |θ| > 1, solutions would violate the condition that errors are uncorrelated with past observations. There are stationary ‘solutions’ that are unreasonable (time runs backward). Started from given values, the process ‘explodes’.
3. If |θ| < 1, the AR(1) can be stationary. It becomes stationary if started from a given value and kept running for an infinite time span. It is stationary if started from the correct distribution. Sloppily, many researchers call such a process stationary; others call it stable.


The first-order moving-average process


The process defined by

Yt = µ + εt + αεt−1

is called the first-order moving-average process or MA(1). The mean EYt = µ is time-constant, such that yt = Yt − µ has mean zero. The variance

varYt = σε² + α²σε² = (1 + α²)σε²

is time-constant. The first-order autocovariance

γ1 = E(yt yt−1 ) = E(εt + αεt−1 )(εt−1 + αεt−2 ) = ασε²

is time-constant.

Finite linear dependence in MA processes

The second-order autocovariance

γ2 = E(yt yt−2 ) = E(εt + αεt−1 )(εt−2 + αεt−3 )

is zero, and similarly for all γk with k ≥ 2. The MA(1) process is stationary, and it is finitely dependent in linear terms: observations at a time distance greater than one are linearly unrelated. MA models ‘forget fast’; they have very short ‘memory’.
Traded wisdom is that AR models are more often found in economic data than MA models.


The autocorrelation function

Rather than the autocovariances γk , most researchers prefer the normalized autocorrelations

ρk = γk / γ0 ,

which by definition are bounded: −1 ≤ ρk ≤ 1. Because of ρk = ρ−k , they can be visualized as a function of the non-negative integers: the autocorrelation function or ACF.
The empirical ACF or sample ACF is sometimes called the correlogram. Others use the term ‘correlogram’ for a visual summary of the empirical ACF and the empirical PACF.


The ACF of simple processes

◮ The ACF of white noise is

ρ0 = 1, ρk = 0 for k > 0.

◮ The ACF of the stationary AR(1) process is

ρk = θ^k , k ∈ N.

◮ The ACF of the MA(1) process is

ρ0 = 1, ρ1 = α/(1 + α²), ρk = 0 for k ≥ 2.
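These theoretical ACFs can be compared with sample estimates. A minimal Python sketch on simulated MA(1) data (α = 0.5 is an assumed, illustrative value):

```python
# Illustrative check of the MA(1) ACF cutoff on simulated data.
import numpy as np

rng = np.random.default_rng(2)
alpha, T = 0.5, 20_000
eps = rng.normal(size=T + 1)
y = eps[1:] + alpha * eps[:-1]      # MA(1): y_t = eps_t + alpha * eps_{t-1}

def acf(x, k):
    """Sample autocorrelation at lag k."""
    x = x - x.mean()
    return (x[k:] * x[:-k]).sum() / (x ** 2).sum()

rho1_theory = alpha / (1 + alpha ** 2)   # = 0.4
rho1, rho2, rho3 = acf(y, 1), acf(y, 2), acf(y, 3)
# rho1 should be near 0.4; rho2 and rho3 near zero (the MA(1) cutoff)
```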


General ARMA processes

A moving-average process of order q or MA(q) is defined by

yt = εt + α1 εt−1 + . . . + αq εt−q .

Similarly, an autoregressive process of order p or AR(p) is defined via

yt = θ1 yt−1 + . . . + θp yt−p + εt .
An amalgam of the two is the ARMA(p, q) process

yt = θ1 yt−1 + . . . + θp yt−p + εt + α1 εt−1 + . . . + αq εt−q .


Infinite-order MA processes

The stable AR(1) model can be subjected to repeated substitution:

yt = θyt−1 + εt = θ(θyt−2 + εt−1 ) + εt
   = θ^k yt−k + εt + θεt−1 + . . . + θ^(k−1) εt−k+1
   = Σ_{j=0}^∞ θ^j εt−j ,

as the term depending on yt−k disappears in the limit. This is an MA(∞) process, sensibly defined only if (as here) the coefficients converge to 0 fast enough. All stable AR processes can be represented as such MA(∞) processes.
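The identity between the recursion and the MA(∞) sum is easy to verify numerically. A sketch with an assumed θ = 0.7 and the process started from y0 = 0, so that the θ^k yt−k term vanishes exactly:

```python
# Numerical check of the repeated-substitution identity (theta assumed).
import numpy as np

rng = np.random.default_rng(3)
theta, T = 0.7, 200
eps = rng.normal(size=T)

# AR(1) recursion started from y_0 = 0
y = np.zeros(T)
for t in range(1, T):
    y[t] = theta * y[t - 1] + eps[t]

# the same value from the moving-average sum y_t = sum_j theta^j eps_{t-j};
# with y_0 = 0 the theta^k y_{t-k} term is exactly zero
ma_sum = sum(theta ** j * eps[T - 1 - j] for j in range(T - 1))
```

The two computations of the last observation agree up to floating-point rounding.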


The lag operator


An operator is a function defined on a set or on its power set with
the image in the same set. The lag operator L is defined by
Lyt = yt−1
and operates on processes, observations etc. It is often mainly a
notational device. Its powers are well defined by
L⁰yt = yt , L⁻¹yt = yt+1 , L^k yt = yt−k ,
and there are also lag polynomials in L:
(1 − θ1 L − θ2 L² − . . . − θp L^p )yt = yt − θ1 yt−1 − . . . − θp yt−p = εt
writes the AR(p) model. In short, we may write
θ(L)yt = εt .


ARMA in lag polynomials

Using

θ(L) = 1 − θ1 L − θ2 L² − . . . − θp L^p

and

α(L) = 1 + α1 L + α2 L² + . . . + αq L^q ,

the ARMA(p, q) process can be written compactly as

θ(L)yt = α(L)εt .

This is short, and it also admits many simple manipulations.


Inverting lag polynomials


In general, the inverse of a polynomial is not a polynomial. Under certain conditions, it can be written as a convergent power series:

θ(z) = 1 − θz ∴ θ⁻¹(z) = Σ_{j=0}^∞ θ^j z^j

converges for |z| ≤ 1 if |θ| < 1. Thus,

(1 − θL)yt = εt ∴ yt = (1 − θL)⁻¹ εt

yields the MA(∞) representation of an AR(1) process. Under certain conditions, the expressions

yt = θ⁻¹(L)α(L)εt , α⁻¹(L)θ(L)yt = εt

work for general ARMA processes.



Characteristic polynomials
One can show (o.c.s.) that the inverses of lag polynomials exist iff all ‘roots’ (zeros) of the corresponding characteristic polynomials are larger than one in modulus:

θ(L) = 1 − θ1 L − θ2 L² − . . . − θp L^p

has the corresponding characteristic polynomial

θ(z) = 1 − θ1 z − θ2 z² − . . . − θp z^p ,

and this condition means that all ζ with θ(ζ) = 0 must have the property |ζ| > 1. Likewise, the MA lag polynomial has the corresponding characteristic polynomial

α(z) = 1 + α1 z + . . . + αq z^q .


Characteristic polynomials for p = 1


For the AR(1) process, the characteristic polynomial is

θ(z) = 1 − θz,

and its only root, z = 1/θ, is larger than one in modulus iff |θ| < 1. Then, the MA(∞) representation exists, and the AR(1) is stable. Generally, roots are large when coefficients are small.
For the MA(1) process, the characteristic polynomial is

α(z) = 1 + αz,

and its only root, z = −1/α, is larger than one in modulus iff |α| < 1. Then, the process has an AR(∞) representation, which may be convenient for prediction.


Canceling or common roots


Suppose that, in the representation of an ARMA process

yt = θ⁻¹(L)α(L)εt , α⁻¹(L)θ(L)yt = εt ,

some of the roots in the characteristic polynomials θ(z) and α(z) coincide. The fundamental theorem of algebra supports the representation (ζj , j = 1, . . . , p are the roots)

θ(z) = (1 − ζ1⁻¹ z)(1 − ζ2⁻¹ z) . . . (1 − ζp⁻¹ z),

with maybe some complex conjugates, and similarly for α(z). The common factors in the expression θ⁻¹(L)α(L) cancel, and the ARMA(p, q) process is equivalent to an ARMA(p − 1, q − 1) process.


An example for common roots

Consider the ARMA(2,1) model

yt = yt−1 − 0.25yt−2 + εt − 0.5εt−1 ∴ (1 − L + 0.25L²)yt = (1 − 0.5L)εt .

Because of 1 − z + 0.25z² = (1 − 0.5z)², there is a common root of z = 2, and the defined process is really the AR(1) model

yt = 0.5yt−1 + εt .

Common roots should be avoided. Representations become non-unique, and attempts to fit an ARMA(2,1) model to AR(1) data imply numerical problems.
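Both claims can be verified numerically: multiplying out (1 − 0.5z)² recovers the AR polynomial, and feeding identical shocks through both recursions gives identical paths. A sketch (illustrative code, not from the slides; pre-sample values are set to zero):

```python
# Numerical illustration of the common root (shocks and lengths assumed).
import numpy as np

rng = np.random.default_rng(4)
T = 300
eps = rng.normal(size=T)

# polynomial check: (1 - 0.5z)(1 - 0.5z) = 1 - z + 0.25 z^2
ar_poly = np.convolve([1.0, -0.5], [1.0, -0.5])

# ARMA(2,1) recursion, pre-sample values treated as zero
y = np.zeros(T)
y[0] = eps[0]
y[1] = y[0] + eps[1] - 0.5 * eps[0]
for t in range(2, T):
    y[t] = y[t - 1] - 0.25 * y[t - 2] + eps[t] - 0.5 * eps[t - 1]

# AR(1) recursion driven by the same shocks
x = np.zeros(T)
x[0] = eps[0]
for t in range(1, T):
    x[t] = 0.5 * x[t - 1] + eps[t]
# y and x coincide: the ARMA(2,1) is 'really' this AR(1)
```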


Unit roots

The random walk


The AR(1) process with θ = 1,

yt = yt−1 + εt ,

is called the random walk. It is not stationary. The root of its characteristic polynomial 1 − z is 1. Its first-order difference

∆yt = yt − yt−1 = εt

is stationary. O.c.s. that this property is shared by all ARMA processes with exactly one root of 1 in their AR polynomial: non-stationary but first-difference stationary. Such processes are called first-order integrated or I(1).


I(1) processes and the lag operator

Suppose that, in the ARMA model

θ(L)yt = α(L)εt ,

the polynomial θ(z) has exactly one root of 1 and all other roots are nice (modulus greater than one). Then, we may write

θ(z) = (1 − z)θ∗(z) ∴ θ∗(L)(1 − L)yt = α(L)εt .

A valid ARMA representation exists for ∆yt = yt∗ with θ∗(z) and α(z) both invertible, and yt is definitely I(1).


Higher-order integration

If the AR polynomial θ(z) has exactly two roots of 1 and all other roots are nice, the transform

∆²yt = (1 − L)²yt = yt − 2yt−1 + yt−2

will be stationary, but yt and ∆yt will not be. The process (yt ) is said to be second-order integrated or I(2).
Box and Jenkins suggested the notation ARIMA(p, d, q) for a process with d roots of one and all other roots nice. Usually, only d ∈ {0, 1, 2} occurs with economic variables.


Testing for a unit root in AR(1)


Apparently, testing for unit roots is of interest: I(1) processes are not stationary, ‘all shocks have a permanent effect’, and taking differences is recommended.
Dickey and Fuller considered testing H0 : θ = 1 in the model

Yt = δ + θYt−1 + εt ,

using the simple t–statistic

DF = (θ̂ − 1) / s.e.(θ̂),

where the denominator is just taken from the usual regression output.
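The DF regression only needs OLS. A minimal numpy sketch on data simulated under the alternative, a stable AR(1) with an assumed θ = 0.5, where the statistic should fall far below the tabulated critical values (around −2.86 at the 5% level for the version with mean):

```python
# DF regression via OLS; data simulated under the alternative (stable AR(1)).
import numpy as np

rng = np.random.default_rng(5)
T, theta = 500, 0.5
Y = np.zeros(T)
for t in range(1, T):
    Y[t] = theta * Y[t - 1] + rng.normal()

# regress Delta Y_t on a constant and Y_{t-1}
dY = np.diff(Y)
X = np.column_stack([np.ones(T - 1), Y[:-1]])
beta, *_ = np.linalg.lstsq(X, dY, rcond=None)
resid = dY - X @ beta
s2 = resid @ resid / (len(dY) - 2)
se = np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])
DF = beta[1] / se      # t-statistic on the coefficient (theta - 1)
```

For stationary data like these, DF is strongly negative and the unit-root null is rejected.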

Dickey-Fuller distribution

O.c.s. that the statistic DF defines a valid test, but that DF is neither t–distributed nor normally distributed under H0. The distribution is non-standard and was first tabulated in 1976.
If a linear time trend is added to the regression,

∆Yt = δ + (θ − 1)Yt−1 + γt + ut ,

the distribution of DF is again different. Versions with mean only and with trend are often called DF–µ and DF–τ .


Why include a linear trend?


For the regression model

∆Yt = δ + (θ − 1)Yt−1 + ut ,

H0 : θ = 1 defines the random walk with drift, HA : θ ∈ (−1, 1) a stable AR(1).
In the regression model

∆Yt = δ + (θ − 1)Yt−1 + γt + ut ,

H0 : θ = 1 admits a random walk with a quadratic trend superimposed, HA : θ ∈ (−1, 1) a trend-stationary AR(1), i.e. a process that becomes stationary after removing a trend. This second problem is maybe more relevant for trending data.

Unit-root testing in an AR(2) model

Consider the AR(2) model

Yt = δ + θ1 Yt−1 + θ2 Yt−2 + εt ,

which can be re-written by some algebraic manipulation as

∆Yt = δ + (θ1 + θ2 − 1)Yt−1 − θ2 ∆Yt−1 + εt .

There will be a unit root in θ(z) iff the coefficient θ1 + θ2 − 1 is 0. The t–statistic on the coefficient of Yt−1 has the same properties as the DF statistic in an AR(1) model: the ‘augmented’ Dickey–Fuller test, really just a DF test.


Unit-root testing in an AR(p) model

Testing for a unit root in the AR(p) model can be conducted by the t–statistic on π̂ in the regression

∆Yt = δ[+γt] + πYt−1 + c1 ∆Yt−1 + . . . + cp−1 ∆Yt−p+1 + ut .

Significance points will be identical to the DF tests in the AR(1) model. O.c.s. that π = θ1 + . . . + θp − 1 is 0 for an I(1) process.

◮ If the DF test rejects, Yt may be stable, stationary, or trend-stationary.
◮ If the DF test does not reject, Yt may be I(1).


Choosing a model

Order selection: the idea

How can one determine the orders p and q in an ARMA model? How can one determine the augmentation order in a DF test application?
The problem is that methods that choose a parameter from a finite or countable set are less developed than hypothesis testing and estimation on a continuum. Fixes:
◮ Visual tools (require individual skills)
◮ Sequences of hypothesis tests (significance levels incorrect)
◮ Information criteria


The autocorrelation function

In MA(q) models, the ACF values ρk are non-zero for k ≤ q and zero for k > q. The sample ACF can be used to determine q. Under conditions, o.c.s. that, for k > q, i.e. for those ρk that are really zero,

√T (ρ̂k − ρk ) → N (0, υk ),

where

υk = 1 + 2ρ1² + . . . + 2ρq² .

This property can be used for testing H0 : ρk = 0 and for drawing confidence bands, plugging in estimates for ρ1 , . . . , ρq .
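The plug-in test can be sketched on simulated MA(1) data (α and T are assumed, illustrative values): ρ̂1 should be clearly significant against a white-noise null, while ρ̂3, standardized with the plug-in υ, should not be.

```python
# Plug-in test of rho_k = 0 using the asymptotic variance upsilon_k.
import numpy as np

rng = np.random.default_rng(6)
alpha, T = 0.5, 2_000
e = rng.normal(size=T + 1)
y = e[1:] + alpha * e[:-1]          # MA(1) data

def acf(x, k):
    x = x - x.mean()
    return (x[k:] * x[:-k]).sum() / (x ** 2).sum()

rho1_hat = acf(y, 1)
# H0: rho_1 = 0 against a white-noise null, upsilon = 1
z1 = np.sqrt(T) * rho1_hat
# H0: rho_3 = 0 allowing an MA(1), upsilon = 1 + 2 rho_1^2 (plug-in)
upsilon = 1 + 2 * rho1_hat ** 2
z3 = np.sqrt(T) * acf(y, 3) / np.sqrt(upsilon)
```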


The partial autocorrelation function


For AR and ARMA models, the sample ACF plots are not helpful. Rather, it is suggested to fit AR(p) models with increasing p:

Yt = δ + θ1.1 Yt−1 + ut ,
Yt = δ + θ1.2 Yt−1 + θ2.2 Yt−2 + ut ,
Yt = δ + θ1.3 Yt−1 + θ2.3 Yt−2 + θ3.3 Yt−3 + ut ,

and note down the last coefficient estimates θ̂1.1 , θ̂2.2 , θ̂3.3 , . . .. Their population counterparts should be non-zero for k = p and zero for k > p. O.c.s. that

√T θ̂k.k → N (0, 1), k > p.
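The successive regressions above are plain OLS. A minimal sketch on simulated AR(1) data (θ and T are assumed values), where the sample PACF should cut off after lag 1:

```python
# Sample PACF by successive OLS regressions on simulated AR(1) data.
import numpy as np

rng = np.random.default_rng(7)
theta, T = 0.7, 5_000
Y = np.zeros(T)
for t in range(1, T):
    Y[t] = theta * Y[t - 1] + rng.normal()

def last_ar_coef(Y, p):
    """OLS of Y_t on a constant and p lags; returns the coefficient on lag p."""
    rows = [np.ones(len(Y) - p)] + [Y[p - j:len(Y) - j] for j in range(1, p + 1)]
    X = np.column_stack(rows)
    beta, *_ = np.linalg.lstsq(X, Y[p:], rcond=None)
    return beta[-1]

pacf = [last_ar_coef(Y, p) for p in (1, 2, 3)]
# pacf[0] estimates theta; pacf[1], pacf[2] should be near zero (cutoff at 1)
```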


The classical visual order determination

◮ If the ACF decays smoothly and the PACF cuts off at lag p, try an AR(p) model;
◮ If the PACF decays smoothly and the ACF cuts off at lag q, try an MA(q) model;
◮ If both ACF and PACF decay smoothly, this may be an ARMA(p, q) model, but you know neither p nor q;
◮ If the ACF decays very slowly, the variable may correspond to an I(1) or even I(2) process, and you may wish to take differences.


Information criteria: the idea

Information criteria are penalized likelihoods: more complex models have a larger likelihood, and penalizing complexity eventually leads to a reasonable choice. By convention, information criteria rely on the negative log-likelihood, are minimal at the optimum, and are often negative.
Information criteria are not an alternative to likelihood-ratio tests or at odds with them. Under conditions, they are equivalent for nested comparisons. Many comparisons are non-nested (such as ARMA(2,1) versus ARMA(1,3)), and direct hypothesis tests cannot be used.


The information criterion AIC

Akaike suggested penalizing complexity linearly:

AIC = log σ̂² + 2 (p + q + 1)/T ,

with σ̂² an estimate for the error variance. The variance term represents minus the log-likelihood. The count of parameters (here p + q + 1) varies among authors (mean, variance).


The information criterion BIC

Schwarz suggested a simplified version of a BIC that had been introduced by Akaike; it penalizes complexity by a logarithmic function of the sample size:

BIC = log σ̂² + (p + q + 1) log T / T .

O.c.s. that minimizing BIC leads to the true p and q as T → ∞. By contrast, minimizing AIC optimizes the prediction properties of the selected model and may be preferable in smaller samples.
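Both criteria can be computed from OLS residual variances, using the slide formulas. A sketch comparing AR orders on simulated AR(1) data (the data-generating θ and T are assumptions):

```python
# AIC and BIC for AR(p) fits, log sigma-hat^2 taken from OLS residuals.
import numpy as np

rng = np.random.default_rng(8)
theta, T = 0.7, 500
Y = np.zeros(T)
for t in range(1, T):
    Y[t] = theta * Y[t - 1] + rng.normal()

def ar_sigma2(Y, p):
    """Residual variance of an OLS AR(p) fit with intercept."""
    rows = [np.ones(len(Y) - p)] + [Y[p - j:len(Y) - j] for j in range(1, p + 1)]
    X = np.column_stack(rows)
    beta, *_ = np.linalg.lstsq(X, Y[p:], rcond=None)
    resid = Y[p:] - X @ beta
    return resid @ resid / len(resid)

orders = range(1, 5)
aic = [np.log(ar_sigma2(Y, p)) + 2 * (p + 1) / T for p in orders]
bic = [np.log(ar_sigma2(Y, p)) + np.log(T) * (p + 1) / T for p in orders]
# BIC penalizes extra lags more heavily, so it never picks a larger order
```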


ARCH

ARCH models: the idea


Financial time series, such as stock prices and exchange rates,
often show little predictability in their means beyond their levels,
their log-differences are white noise. However, the following
features are found:
◮ Heavy tails: there are far more unusually large and small
observations than for Gaussian data. There is substantial
excess kurtosis.
◮ Volatility clustering: episodes with large and small variation
follow each other. Large changes in the level are often
succeeded by more large moves, with their direction being
unpredictable.

The ARCH (autoregressive conditional heteroskedasticity) model


by Engle (1982) captures these features successfully.

The ARCH(1) model

The simplest ARCH model is the ARCH(1) model

σt² = E(εt²|It−1 ) = ϖ + αεt−1² ,

where (εt ) is a white-noise process, It−1 denotes an information set containing all εs , s ≤ t − 1, and ϖ > 0, α ≥ 0.
σt² is a local variance and represents volatility.

Stationary ARCH(1)

If an ARCH(1) process (εt ) is stationary, its variance must be time-constant. Taking expectations yields

σε² = Eσt² = ϖ + αE(εt−1²) = ϖ + ασε² ,

and hence

σε² = ϖ/(1 − α).

Indeed, o.c.s. that stationary ARCH(1) processes exist iff α ∈ [0, 1). Whereas local volatility changes over time, the unconditional variance is time-constant.
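The ARCH(1) recursion is easy to simulate via εt = σt νt with νt purely random. A sketch checking the unconditional-variance formula and the heavy tails mentioned earlier (parameter values ϖ = 0.1, α = 0.5 are illustrative assumptions):

```python
# ARCH(1) simulation via eps_t = sigma_t * nu_t (parameters assumed).
import numpy as np

rng = np.random.default_rng(9)
omega, alpha, T = 0.1, 0.5, 50_000
nu = rng.normal(size=T)             # purely random N(0,1) innovations

eps = np.zeros(T)
sig2 = np.zeros(T)
sig2[0] = omega / (1 - alpha)       # start at the unconditional variance
eps[0] = np.sqrt(sig2[0]) * nu[0]
for t in range(1, T):
    sig2[t] = omega + alpha * eps[t - 1] ** 2
    eps[t] = np.sqrt(sig2[t]) * nu[t]

var_theory = omega / (1 - alpha)    # = 0.2
kurt = np.mean(eps ** 4) / np.mean(eps ** 2) ** 2   # > 3: heavy tails
```

The sample variance should be close to ϖ/(1 − α) = 0.2, while the kurtosis clearly exceeds the Gaussian value of 3.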


Higher-order ARCH models

The model

σt² = ϖ + α1 εt−1² + . . . + αp εt−p²

defines an ARCH(p) process. Like the ARCH(1) process, it can be easily generated from a purely random series νt iteratively by

εt = σt νt = νt √(ϖ + Σ_{j=1}^p αj εt−j²).

Stationarity would require α1 + . . . + αp < 1.


GARCH models
The GARCH (generalized ARCH) model by Bollerslev lets volatility depend also on its own past. For example, the GARCH(1,1) reads

σt² = ϖ + αεt−1² + βσt−1² ,

with α, β ≥ 0 and α = 0 ⇒ β = 0. This is a very popular model and fits some financial time series surprisingly well. Stationarity requires α + β < 1. Then, it is easy to show that σ² = ϖ/(1 − α − β).
It is straightforward to define GARCH(p, q) models with larger orders.
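A GARCH(1,1) path can be generated the same way as an ARCH path. A sketch checking σ² = ϖ/(1 − α − β) and the volatility clustering in εt² (the parameters are assumptions, chosen so that α + β < 1 and the unconditional variance is 1):

```python
# GARCH(1,1) simulation (illustrative parameters, alpha + beta < 1).
import numpy as np

rng = np.random.default_rng(10)
omega, alpha, beta, T = 0.05, 0.1, 0.85, 100_000
nu = rng.normal(size=T)

eps = np.zeros(T)
sig2 = np.zeros(T)
sig2[0] = omega / (1 - alpha - beta)    # unconditional variance = 1.0
eps[0] = np.sqrt(sig2[0]) * nu[0]
for t in range(1, T):
    sig2[t] = omega + alpha * eps[t - 1] ** 2 + beta * sig2[t - 1]
    eps[t] = np.sqrt(sig2[t]) * nu[t]

# volatility clustering: eps_t^2 is positively autocorrelated
z = eps ** 2 - (eps ** 2).mean()
acf1_sq = (z[1:] * z[:-1]).sum() / (z ** 2).sum()
```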


Integrated GARCH

Often, in estimated GARCH(1,1) models, α̂ + β̂ ≈ 1. For α + β = 1, the name is IGARCH (integrated GARCH). IGARCH processes may even be stationary, with infinite variance. Here, one needs the alternative definition of strict stationarity (time-constant distribution) rather than covariance stationarity (covariances do not exist).
Generally, ARCH processes have large or infinite kurtosis, though infinite variance may be extreme and is usually not supported in empirical finance.


Exponential GARCH model

The EGARCH (exponential GARCH) model by Daniel Nelson is the most popular nonlinear ARCH–type model:

log σt² = ϖ + β log σt−1² + γ εt−1/σt−1 + α |εt−1|/σt−1 ,

with γ ≠ 0 implying asymmetric reaction. EGARCH needs no positivity constraints on parameters. Often, the model fits well empirically.
