Lecture 2 Time Series Analysis

The document provides an overview of time series analysis, emphasizing the characteristics of economic data, types of time series models, and the importance of stationarity. It discusses various models such as autoregressive (AR), moving-average (MA), and their combinations (ARMA), along with methods for identifying these models. Additionally, it highlights the significance of understanding dynamic structures in economic and financial data for accurate forecasting and policy analysis.


Time series analysis

Lecture 2
Textbooks
❖Enders, W. (2010), Applied Econometric Time Series Analysis, 3rd Edition,
John Wiley & Sons: New York.

❖Tsay, R. S. (2005), Analysis of Financial Time Series, 2nd Edition, John Wiley
& Sons: New York.
Lecture Objectives

❖Characteristics of economic data


❖Know the types of time series models
❖What is stationarity?
❖Know how to identify models
❖Reduce-form vs structural equations
Time series analysis
❖A time series is an ordered temporal variable.
❖Time series is a collection of observations indexed by the date of each
realisation.
❖Using notation that starts at time t = 1 and ends at time t = T,
{𝑦1 , 𝑦2 , 𝑦3 , … , 𝑦𝑇 }
❖Time index can be of any frequency (e.g. daily, weekly, monthly, quarterly,
etc.)
Time series analysis
❖After investigating the behaviour of these variables we may gain a better
understanding of the past and in certain cases we can predict the future.
❖Why do we need a separate area of study for investigations that involve time series
variables?
❖Consider the simple linear regression model
❖𝑦𝑡 = 𝛽𝑥𝑡 + 𝜀𝑡 , t = 1, … , T
❖For least squares estimates in such a model to be efficient, the errors should not be serially correlated.
Time series analysis
❖Coefficient estimates are inefficient when there is serial correlation in the
errors, so standard errors may be incorrect.

❖If we incorporate the dynamic properties of the variables in the explained


part of the model then the serial correlation may be removed from the
errors.
Time series analysis
❖Most economic and financial time series exhibit some form of serial
correlation.
➢If economic output is large during the previous quarter then there is a good
chance that it is going to be large in the current quarter.
❖Particular shock may affect variables over successive quarters.
❖Hence, we need to start thinking about the dynamic structure of the data
generating process.
Time series analysis
❖The literature on various forms of time series analysis is extensive.
➢A large part of the literature is concerned with forecasting, which remains an
important area of study.
➢There is also an important section that involves specific tests for various
hypotheses (or theories).
➢These models are also used for policy analysis purposes.
Time series analysis
❖Irrespective of the objective, we usually need to identify the dynamic evolution of
these variables.
❖This usually starts with some form of decomposition of the time series into its
constituent components.
Real World Data
Economic and Financial Data
❖Consider the information content that is contained in the series.
❖Most economic data contains trend, seasonal, and irregular components.
❖Most of the data is measured in discrete time (with relatively long intervals)
❖They may be expressed as rates, indices or totals.
❖Be cautious of using interpolation to generate higher frequency data
Economic and Financial Data
❖Many of the respective series are subject to revision.
❖Common transformations include the calculation of growth rates [log(GDP𝑡 /GDP𝑡−1 )]
❖One may need to ensure that the variables are measured at the same frequency.
❖Most countries follow globally accepted measurement practices.
❖Most financial studies involve returns
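The log growth-rate transformation above can be sketched in a few lines; the GDP figures below are made-up illustrative values, not real data.

```python
import math

# Hypothetical quarterly GDP levels (illustrative numbers, not real data)
gdp = [100.0, 102.0, 101.5, 103.2]

# Growth rate as the log difference: log(GDP_t / GDP_{t-1})
growth = [math.log(gdp[t] / gdp[t - 1]) for t in range(1, len(gdp))]

for t, g in enumerate(growth, start=1):
    print(f"quarter {t}: {g:.4%}")
```

Log differences are preferred to simple percentage changes because they are additive over time and approximately equal to percentage changes for small movements.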
Economic and Financial Data
❖Data on stock prices and indices is overwhelming:
➢Does the data contain true trading prices, quotes, or proxies for trading
prices?
➢Do the prices include transaction costs, and commissions?
➢Is the market sufficiently liquid?
➢Have the prices been adjusted for inflation?
➢Have they been correctly discounted?
Economic and Financial Data
❖May only be interested in buyer-initiated (ask) or seller-initiated (bid) orders
❖Transformation to returns generally yields stationary behaviour
❖Returns represent a complete, scale-free summary of an investment opportunity.
Notations of time series
❖𝑥𝑡 = the current value of x.
❖𝑥𝑡−1 =the lag of variable x (value of x one period ago)
❖𝑥𝑡+1 =the lead of variable x (value of x one period into the future)
❖Δ is the difference operator
❖Δ𝑥𝑡 =𝑥𝑡 -𝑥𝑡−1 :the first difference of 𝑥𝑡
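The lag and difference notation can be illustrated directly; the numbers below are arbitrary.

```python
# First difference: Δx_t = x_t − x_{t−1}, applied to a short series
x = [5.0, 7.0, 6.0, 9.0]

# dx[0] is Δx at t = 1, i.e. x[1] − x[0]; differencing loses one observation
dx = [x[t] - x[t - 1] for t in range(1, len(x))]
print(dx)  # → [2.0, -1.0, 3.0]
```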
Types of time series models
❖Static models: 𝑦𝑡 =𝛼0 + 𝛼1 𝑥𝑡 + 𝜀𝑡
❖Autoregressive models (AR):𝑦𝑡 = 𝛼0 + 𝛼1 𝑦𝑡−1 + 𝜀𝑡
❖Moving-average models (MA):𝑦𝑡 = 𝜀𝑡 + 𝛽1 𝜀𝑡−1
❖Autoregressive moving-average models (ARMA)
❖Distributed lag models (DL):𝑦𝑡 = 𝛼0 + 𝛼1 𝑥𝑡 + 𝛼2 𝑥𝑡−1 + 𝜀𝑡
❖Autoregressive distributed lag models (ARDL)
❖𝑦𝑡 =𝛼0 + 𝛼1 𝑦𝑡−1 + 𝛼2 𝑥𝑡−1 + 𝜀𝑡
Stationarity
❖The foundation of time series analysis is stationarity.
❖A time series is strictly stationary if the joint distribution is invariant under
time shift.
➢This is a strong condition that is hard to verify empirically.
Stationarity
❖ A time series is covariance-stationary if its mean and all autocovariances are unaffected by a
change of time origin.
➢ In the literature, a covariance-stationary process is also referred to as weakly stationary.
❖ A stochastic process is covariance-stationary if
E(𝑦𝑡 ) = E(𝑦𝑡−𝑠 ) = μ
E[(𝑦𝑡 − μ)²] = E[(𝑦𝑡−𝑠 − μ)²] = 𝜎𝑦²
E[(𝑦𝑡 − μ)(𝑦𝑡−𝑠 − μ)] = E[(𝑦𝑡−𝑗 − μ)(𝑦𝑡−𝑗−𝑠 − μ)] = 𝛾𝑠
where μ, 𝜎𝑦², and 𝛾𝑠 are all constants.
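A minimal numerical check of these conditions, using simulated Gaussian white noise (the simplest covariance-stationary process); the seed and sample size are arbitrary choices.

```python
import random

random.seed(7)

# White noise: constant mean, constant variance, zero autocovariance at all lags
T = 10000
y = [random.gauss(0, 1) for _ in range(T)]

# A change of time origin should leave the sample mean and variance
# (approximately) unchanged, so compare the two halves of the sample
first, second = y[: T // 2], y[T // 2 :]

def mean(v):
    return sum(v) / len(v)

def var(v):
    m = mean(v)
    return sum((x - m) ** 2 for x in v) / len(v)

print(round(mean(first), 2), round(mean(second), 2))
print(round(var(first), 2), round(var(second), 2))
```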
Autoregressive Models
❖𝑟𝑡 = 𝜙0 + 𝜙1 𝑟𝑡−1 + 𝜀𝑡
➢𝑟𝑡 is the dependent variable.
➢𝑟𝑡−1 is the explanatory variable
➢This is an AR(1)
➢𝜀𝑡 is a white noise series.
Autoregressive Models
❖A sequence, 𝜀𝑡 , is a white-noise process if each value in the sequence has a mean of
zero, has a constant variance, and is uncorrelated with all other realizations.
❖The AR(1) model has similar properties to those of the simple linear regression
model.
❖The AR(1) model implies that, conditional on the past return 𝑟𝑡−1 , we have
➢ E(𝑟𝑡 | 𝑟𝑡−1 ) = 𝜙0 + 𝜙1 𝑟𝑡−1
➢ Var(𝑟𝑡 | 𝑟𝑡−1 ) = Var(𝜀𝑡 ) = 𝜎𝜀2
Autoregressive Models
❖There are situations in which 𝑟𝑡−1 alone cannot determine the conditional
expectation of 𝑟𝑡 and a more flexible model must be sought.
➢Generalisation of the AR(1) model to the AR(p) model
➢𝑟𝑡 = 𝜙0 + 𝜙1 𝑟𝑡−1 + … + 𝜙𝑝 𝑟𝑡−𝑝 + 𝜀𝑡
➢where p is a non-negative integer.
➢The AR(p) model has the same form as a multiple linear regression model
with lagged values serving as the explanatory variables.
Autoregressive Models
❖For a stationary AR(1) process, the constant term is related to the mean of
the variable.
❖The AR(1) model is weakly stationary.
➢The necessary and sufficient condition for the AR(1) model to be weakly
stationary is |𝜙1 | < 1
❖The autocorrelation function of a weakly stationary AR(1) series decays
exponentially.
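A simulation sketch of these two properties; the values of 𝜙0 and 𝜙1 are illustrative assumptions. The sample mean should be near 𝜙0 /(1 − 𝜙1 ), and the sample ACF at lag s should decay like 𝜙1 raised to the power s.

```python
import random

random.seed(42)

phi0, phi1 = 0.2, 0.7  # illustrative AR(1) coefficients, |phi1| < 1
T = 20000

# Simulate a stationary AR(1): r_t = phi0 + phi1*r_{t-1} + e_t
r = [phi0 / (1 - phi1)]  # start at the unconditional mean
for _ in range(T - 1):
    r.append(phi0 + phi1 * r[-1] + random.gauss(0, 1))

mean = sum(r) / T

def acf(series, lag):
    # Sample autocorrelation at the given lag
    m = sum(series) / len(series)
    num = sum((series[t] - m) * (series[t - lag] - m)
              for t in range(lag, len(series)))
    den = sum((v - m) ** 2 for v in series)
    return num / den

# Sample ACF at lag s should be close to phi1**s (exponential decay)
for s in (1, 2, 3):
    print(s, round(acf(r, s), 3), round(phi1 ** s, 3))
```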
Identifying AR Models in practice
❖The order p of an AR time series is unknown and must be specified
empirically.
➢This is referred to as the order determination of AR models.
❖Two general approaches are available for determining the value of p:
➢Use the partial autocorrelation function (PACF)
➢Use some information criterion function
Information criterion
❖Akaike Information criterion (AIC)
❖AIC = (−2/T) ln(likelihood) + (2/T) × (number of parameters)
❖Where the likelihood function is evaluated at the maximum likelihood estimates and
T is the sample size.
❖For a Gaussian AR(l) model, AIC reduces to: AIC(l) = ln(𝜎𝑙2 ) + 2l/T
❖Where 𝜎𝑙2 is the maximum likelihood estimate of 𝜎𝜀2 , which is the variance of 𝜀𝑡 . T
is the sample size.
❖Choose order with the minimum AIC value.
Schwarz Bayesian information criterion
❖Another commonly used criterion function is the BIC.
❖For a Gaussian AR(l) model, the criterion is
❖BIC(l) = ln(𝜎𝑙2 ) + l ln(T)/T

❖The penalty for each parameter used is 2/T for AIC and ln(T)/T for BIC
➢Thus, BIC tends to select a lower-order AR model when the sample size is
moderate or large.
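The two criteria can be compared side by side. The residual variances below are hypothetical, chosen only to illustrate how BIC's heavier per-parameter penalty can select a lower order than AIC.

```python
import math

T = 200
# Hypothetical ML residual variances from fitting AR(l) for l = 1..4
# (illustrative numbers: the variance barely falls after the true order)
sigma2 = {1: 1.10, 2: 1.02, 3: 1.002, 4: 1.001}

# AIC(l) = ln(sigma2_l) + 2*l/T ; BIC(l) = ln(sigma2_l) + l*ln(T)/T
aic = {l: math.log(s2) + 2 * l / T for l, s2 in sigma2.items()}
bic = {l: math.log(s2) + l * math.log(T) / T for l, s2 in sigma2.items()}

# Choose the order that minimises each criterion
best_aic = min(aic, key=aic.get)
best_bic = min(bic, key=bic.get)
print(best_aic, best_bic)  # AIC picks 3; BIC picks the more parsimonious 2
```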
Autoregressive Models
❖If the model is adequate, then the residual series should behave as a white
noise.
➢Residual serial correlation should be insignificant at the 1% and 5% levels.
➢If not, try a different lag order.
❖Choose a model with a better goodness of fit.
➢Adjusted 𝑅2 .
Moving-Average Models
❖A weighted sum of current and lagged values of a white-noise process, with
weights 𝛽𝑖 , is called a moving average.
❖𝑥𝑡 = 𝛽0 𝜀𝑡 + 𝛽1 𝜀𝑡−1 + … + 𝛽𝑞 𝜀𝑡−𝑞
❖A moving average of order q is denoted by MA(q)
❖Can use the ACF to identify the order of MA model.
❖Use Maximum likelihood estimation to estimate MA models.
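A quick simulation sketch (𝛽1 = 0.6 is an arbitrary illustrative value): the ACF of an MA(1) is 𝛽1 /(1 + 𝛽1 ²) at lag 1 and zero beyond, which is why the ACF identifies the order q.

```python
import random

random.seed(0)

beta1 = 0.6  # illustrative MA(1) coefficient
T = 20000
eps = [random.gauss(0, 1) for _ in range(T + 1)]

# MA(1): x_t = e_t + beta1 * e_{t-1}
x = [eps[t] + beta1 * eps[t - 1] for t in range(1, T + 1)]

def acf(series, lag):
    # Sample autocorrelation at the given lag
    m = sum(series) / len(series)
    num = sum((series[t] - m) * (series[t - lag] - m)
              for t in range(lag, len(series)))
    den = sum((v - m) ** 2 for v in series)
    return num / den

# Theoretical ACF: rho(1) = beta1/(1 + beta1**2) ≈ 0.441, rho(s) = 0 for s > 1
print(round(acf(x, 1), 3), round(acf(x, 2), 3))
```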
ARMA models
❖𝑦𝑡 = 𝑎0 + 𝑎1 𝑦𝑡−1 + … + 𝑎𝑝 𝑦𝑡−𝑝 + 𝛽0 𝜀𝑡 + 𝛽1 𝜀𝑡−1 + … + 𝛽𝑞 𝜀𝑡−𝑞
❖If the characteristic roots of this equation all lie inside the unit circle, 𝑦𝑡 is
called an autoregressive moving-average (ARMA) process.
❖If one or more characteristic roots are equal to or greater than unity in
absolute value, then 𝑦𝑡 is said to be an integrated process and the model is
called an autoregressive integrated moving-average (ARIMA) model.
❖The stability condition is a necessary condition for the time-series 𝑦𝑡 to be
stationary.
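The root condition can be checked numerically. A sketch for the AR(2) case, assuming the characteristic equation z² − 𝑎1 z − 𝑎2 = 0; the coefficient values are illustrative.

```python
import cmath

def ar2_characteristic_roots(a1, a2):
    # Roots of the AR(2) characteristic equation z^2 - a1*z - a2 = 0
    disc = cmath.sqrt(a1 * a1 + 4 * a2)
    return ((a1 + disc) / 2, (a1 - disc) / 2)

def is_stationary(a1, a2):
    # Stationarity requires every characteristic root inside the unit circle
    return all(abs(z) < 1 for z in ar2_characteristic_roots(a1, a2))

print(is_stationary(0.5, 0.3))  # both roots inside the unit circle: stationary
print(is_stationary(1.0, 0.0))  # a unit root: integrated, an ARIMA case
```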
Reduced-Form and Structural Equations
❖Consider the Samuelson model:
❖𝑦𝑡 =𝑐𝑡 +𝑖𝑡
❖𝑐𝑡 =α𝑦𝑡−1 +𝜀𝑐𝑡 where 0<α<1
❖𝑖𝑡 =β(𝑐𝑡 -𝑐𝑡−1 )+𝜀𝑖𝑡 where β>0.
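Substituting the consumption and investment equations into the identity gives the reduced form, in which 𝑦𝑡 depends only on its own lags and the disturbances (a worked derivation in the model's notation):

```latex
\begin{aligned}
y_t &= c_t + i_t \\
    &= \alpha y_{t-1} + \varepsilon_{ct}
       + \beta\bigl(\alpha y_{t-1} + \varepsilon_{ct}
       - \alpha y_{t-2} - \varepsilon_{c,t-1}\bigr) + \varepsilon_{it} \\
    &= \alpha(1+\beta)\, y_{t-1} - \alpha\beta\, y_{t-2}
       + (1+\beta)\,\varepsilon_{ct} - \beta\,\varepsilon_{c,t-1}
       + \varepsilon_{it}
\end{aligned}
```

The original three equations are structural (they embody the behavioural assumptions); the final line is the reduced form, an ARMA-type equation in 𝑦𝑡 alone that can be estimated directly.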
