CH 09
Walter R. Paczkowski
Rutgers University
Chapter 9: Regression with Time Series Data: Stationary Variables
Principles of Econometrics, 4th Edition
Chapter Contents
9.1 Introduction
9.2 Finite Distributed Lags
9.3 Serial Correlation
9.4 Other Tests for Serially Correlated Errors
9.5 Estimation with Serially Correlated Errors
9.6 Autoregressive Distributed Lag Models
9.7 Forecasting
9.8 Multiplier Analysis
9.1.1 Dynamic Nature of Relationships
Ways to model the dynamic relationship:
1. Specify that a dependent variable y is a function of current and past values of an explanatory variable x
Eq. 9.1  y_t = f(x_t, x_{t-1}, x_{t-2}, …)
9.1.1 Dynamic Nature of Relationships
Ways to model the dynamic relationship (Continued):
2. Capture the dynamic characteristics of a time series by specifying a model with a lagged dependent variable as one of the explanatory variables
Eq. 9.2  y_t = f(y_{t-1}, x_t)
• Or have:
Eq. 9.3  y_t = f(y_{t-1}, x_t, x_{t-1}, x_{t-2})
– Such models are called autoregressive distributed lag (ARDL) models, with "autoregressive" meaning a regression of y_t on its own lag or lags
9.1 Introduction
9.1.1 Dynamic Nature of Relationships
Ways to model the dynamic relationship (Continued):
3. Model the continuing impact of change over several periods via the error term
9.1.2 Least Squares Assumptions
The primary assumption is Assumption MR4: cov(e_t, e_s) = 0 for t ≠ s
9.1.2a Stationarity
9.1.3 Alternative Paths Through the Chapter
Eq. 9.6  y_{T+1} = α + β_0 x_{T+1} + β_1 x_T + β_2 x_{T-1} + … + β_q x_{T-q+1} + e_{T+1}
– Policy analysis
• What is the effect of a change in x on y?
Eq. 9.7  β_s = ∂E(y_t)/∂x_{t-s} = ∂E(y_{t+s})/∂x_t
9.2.1 Assumptions
TSMR1. y_t = α + β_0 x_t + β_1 x_{t-1} + β_2 x_{t-2} + … + β_q x_{t-q} + e_t,  t = q+1, …, T
TSMR2. y and x are stationary random variables, and e_t is independent of current, past and future values of x
TSMR3. E(e_t) = 0
TSMR4. var(e_t) = σ²
TSMR5. cov(e_t, e_s) = 0 for t ≠ s
TSMR6. e_t ~ N(0, σ²)
9.2.2 An Example: Okun's Law
Eq. 9.8  U_t − U_{t-1} = −γ(G_t − G_N)
– We can rewrite this as:
Eq. 9.9  DU_t = α + β_0 G_t + e_t
where DU = ΔU = U_t − U_{t-1}, β_0 = −γ, and α = γG_N
9.2 Finite Distributed Lags
9.2.2 An Example: Okun's Law
Eq. 9.11  G_t = 100 × (GDP_t − GDP_{t-1}) / GDP_{t-1}
9.3.1 Serial Correlation in Output Growth
9.3.1a Computing Autocorrelation
For the Okun's Law problem, we have:
Eq. 9.12  ρ_1 = cov(G_t, G_{t-1}) / √(var(G_t) var(G_{t-1})) = cov(G_t, G_{t-1}) / var(G_t)
9.3.1a Computing Autocorrelation
The sample moments are:
cov̂(G_t, G_{t-1}) = (1/(T−1)) Σ_{t=2}^{T} (G_t − Ḡ)(G_{t-1} − Ḡ)
var̂(G_t) = (1/(T−1)) Σ_{t=1}^{T} (G_t − Ḡ)²
Eq. 9.13  r_1 = Σ_{t=2}^{T} (G_t − Ḡ)(G_{t-1} − Ḡ) / Σ_{t=1}^{T} (G_t − Ḡ)²
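A minimal sketch of Eq. 9.13 in NumPy, generalized to lag k (both deviations use the full-sample mean and the denominator sums over all T observations, as in the formula); the series is made up:

```python
import numpy as np

def autocorr(x, k):
    # Sample autocorrelation at lag k (Eq. 9.13 is the k = 1 case)
    x = np.asarray(x, dtype=float)
    xbar = x.mean()
    num = np.sum((x[k:] - xbar) * (x[:-k] - xbar))
    den = np.sum((x - xbar) ** 2)
    return num / den

# A made-up, mildly persistent series
x = [1.0, 2.0, 3.0, 4.0, 3.0, 2.0, 1.0, 2.0, 3.0, 4.0]
r1 = autocorr(x, 1)   # positive, since neighboring values move together
```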
For our problem, we have:
9.3.1b The Correlogram
9.3.2 Serially Correlated Errors
9.3.2a A Phillips Curve
9.3.2a A Phillips Curve
The k-th order autocorrelation for the residuals can be written as:
Eq. 9.21  r_k = Σ_{t=k+1}^{T} ê_t ê_{t-k} / Σ_{t=1}^{T} ê_t²
– The least squares equation is:
9.3.2a
A Phillips Curve
9.4.1
A Lagrange
Multiplier Test
9.4.1
A Lagrange
Multiplier Test
9.4.1
A Lagrange
Multiplier Test
9.4.1
A Lagrange
Multiplier Test
9.4.1
A Lagrange
Multiplier Test
To derive the relevant auxiliary regression for the autocorrelation LM test, we write the test equation as:
Eq. 9.25  y_t = β_1 + β_2 x_t + ρê_{t-1} + v_t
Substituting y_t = b_1 + b_2 x_t + ê_t gives:
b_1 + b_2 x_t + ê_t = β_1 + β_2 x_t + ρê_{t-1} + v_t
Rearranging, we get:
Eq. 9.26  ê_t = (β_1 − b_1) + (β_2 − b_2) x_t + ρê_{t-1} + v_t = γ_1 + γ_2 x_t + ρê_{t-1} + v_t
– If H_0: ρ = 0 is true, then LM = T × R² has an approximate χ²(1) distribution
• T and R² are the sample size and goodness-of-fit statistic, respectively, from least squares estimation of Eq. 9.26
9.4.1 A Lagrange Multiplier Test
Considering the two alternative ways to handle ê_0:
(iii) LM = (T − 1) × R² = 89 × 0.3102 = 27.61
(iv) LM = T × R² = 90 × 0.3066 = 27.59
– These values are much larger than 3.84, which is the 5% critical value from a χ²(1)-distribution
• We reject the null hypothesis of no autocorrelation
– Alternatively, we can reject H_0 by examining the p-value for LM = 27.61, which is 0.000
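The auxiliary regression is easy to reproduce end to end. The sketch below uses plain NumPy on simulated data (all numbers, including ρ = 0.5, are illustrative, not the text's) and forms LM = (T − 1) × R² as in variant (iii):

```python
import numpy as np

rng = np.random.default_rng(0)
T = 90
x = rng.normal(size=T)
v = rng.normal(size=T)
e = np.zeros(T)
for t in range(1, T):
    e[t] = 0.5 * e[t - 1] + v[t]       # AR(1) errors with illustrative rho = 0.5
y = 1.0 + 2.0 * x + e

# Step 1: least squares residuals from the original model
X = np.column_stack([np.ones(T), x])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
ehat = y - X @ b

# Step 2: regress e-hat_t on an intercept, x_t, and e-hat_{t-1} (first obs dropped)
Z = np.column_stack([np.ones(T - 1), x[1:], ehat[:-1]])
g, *_ = np.linalg.lstsq(Z, ehat[1:], rcond=None)
resid = ehat[1:] - Z @ g
R2 = 1 - resid @ resid / np.sum((ehat[1:] - ehat[1:].mean()) ** 2)

LM = (T - 1) * R2   # compare with the chi-square(1) critical value 3.84
```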
9.4.1a Testing Correlation at Longer Lags
9.4.2 The Durbin-Watson Test
9.5.1 Least Squares Estimation
Suppose we proceed with least squares estimation
without recognizing the existence of serially
correlated errors. What are the consequences?
1. The least squares estimator is still a linear
unbiased estimator, but it is no longer best
2. The formulas for the standard errors usually
computed for the least squares estimator are
no longer correct
• Confidence intervals and hypothesis tests
that use these standard errors may be
misleading
9.5 Estimation with Serially Correlated Errors
9.5.1 Least Squares Estimation
Consider the model y_t = β_1 + β_2 x_t + e_t
– The variance of b_2 is:
where w_t = (x_t − x̄) / Σ_t (x_t − x̄)²
9.5.1 Least Squares Estimation
When the errors are not correlated, cov(e_t, e_s) = 0, and the term in square brackets is equal to one.
– The resulting expression is the usual least squares variance, var(b_2) = σ² / Σ_t (x_t − x̄)²
9.5.2 Estimating an AR(1) Error Model
Eq. 9.30 (e_t = ρe_{t-1} + v_t) describes a first-order autoregressive model or a first-order autoregressive process for e_t
– The term AR(1) model is used as an abbreviation for first-order autoregressive model
– It is called an autoregressive model because it can be viewed as a regression model
– It is called first-order because the right-hand-side variable is e_t lagged one period
9.5.2a Properties of an AR(1) Error
We assume that:
Eq. 9.32  −1 < ρ < 1
The mean and variance of e_t are:
Eq. 9.33  E(e_t) = 0,  var(e_t) = σ_e² = σ_v² / (1 − ρ²)
9.5.2a Properties of an AR(1) Error
The correlation implied by the covariance is:
Eq. 9.35  ρ_k = corr(e_t, e_{t-k}) = cov(e_t, e_{t-k}) / √(var(e_t) var(e_{t-k})) = cov(e_t, e_{t-k}) / var(e_t) = [ρ^k σ_v² / (1 − ρ²)] / [σ_v² / (1 − ρ²)] = ρ^k
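The stationary variance in Eq. 9.33 can be checked by simulation; a minimal sketch with illustrative values ρ = 0.6 and σ_v = 1 (not parameters from the text):

```python
import numpy as np

rho, sigma_v = 0.6, 1.0
rng = np.random.default_rng(1)
n = 200_000
v = rng.normal(scale=sigma_v, size=n)
e = np.zeros(n)
for t in range(1, n):
    e[t] = rho * e[t - 1] + v[t]        # AR(1) error process, Eq. 9.30

sample_var = e[1000:].var()             # drop a burn-in so the process is near stationarity
theory_var = sigma_v**2 / (1 - rho**2)  # Eq. 9.33: 1 / (1 - 0.36) = 1.5625
```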
Chapter 9: Regression with Time Series Data:
Principles of Econometrics, 4th Edition Stationary Variables
Page 75
9.5.2a Properties of an AR(1) Error
Setting k = 1 gives ρ_1 = ρ
Each e_t depends on all past values of the errors v_t:
e_t = v_t + ρv_{t-1} + ρ²v_{t-2} + ρ³v_{t-3} + …
Using ρ̂ = r_1 = 0.549:
ρ̂_1 = ρ̂ = 0.549
ρ̂_2 = ρ̂² = 0.549² = 0.301
ρ̂_3 = ρ̂³ = 0.549³ = 0.165
ρ̂_4 = ρ̂⁴ = 0.549⁴ = 0.091
ρ̂_5 = ρ̂⁵ = 0.549⁵ = 0.050
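The chain above is just successive powers of r_1; a two-line check:

```python
# The estimated autocorrelations are powers of rho-hat = r1 = 0.549
rho_hat = 0.549
powers = [round(rho_hat ** k, 3) for k in range(1, 6)]
# matches the values above: 0.549, 0.301, 0.165, 0.091, 0.050
```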
9.5.2b Nonlinear Least Squares Estimation
With the appropriate substitutions, we get:
Eq. 9.41  e_{t-1} = y_{t-1} − β_1 − β_2 x_{t-1}
– Multiplying by ρ:
ρe_{t-1} = ρy_{t-1} − ρβ_1 − ρβ_2 x_{t-1}
Substituting, we get:
Eq. 9.43  y_t = β_1(1 − ρ) + β_2 x_t + ρy_{t-1} − ρβ_2 x_{t-1} + v_t
Our Phillips Curve model assuming AR(1) errors is:
Eq. 9.44  INF_t = β_1(1 − ρ) + β_2 DU_t + ρINF_{t-1} − ρβ_2 DU_{t-1} + v_t
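Eq. 9.43 can be fit by minimizing the sum of squared v_t numerically. A sketch using scipy.optimize.least_squares on simulated data — the true values β_1 = 1, β_2 = −0.5, ρ = 0.5 and all data are illustrative, not the book's Phillips Curve estimates:

```python
import numpy as np
from scipy.optimize import least_squares

# Simulate a model with AR(1) errors (illustrative parameters)
rng = np.random.default_rng(2)
T = 500
x = rng.normal(size=T)
v = rng.normal(scale=0.3, size=T)
e = np.zeros(T)
for t in range(1, T):
    e[t] = 0.5 * e[t - 1] + v[t]
y = 1.0 - 0.5 * x + e

def resid(params):
    b1, b2, rho = params
    # v_t implied by Eq. 9.43, for t = 2, ..., T
    return y[1:] - (b1 * (1 - rho) + b2 * x[1:] + rho * y[:-1] - rho * b2 * x[:-1])

fit = least_squares(resid, x0=[0.0, 0.0, 0.0])
b1_hat, b2_hat, rho_hat = fit.x
```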
9.5.2c Generalized Least Squares Estimation
9.5.3 Estimating a More General Model
We have the model:
Eq. 9.47  y_t = δ + θ_1 y_{t-1} + δ_0 x_t + δ_1 x_{t-1} + v_t
9.5.4 Summary of Section 9.5 and Looking Ahead
We have described three ways of overcoming the effect of serially correlated errors:
1. Estimate the model using least squares with HAC standard errors
2. Use nonlinear least squares to estimate the model with a lagged x, a lagged y, and the restriction implied by an AR(1) error specification
3. Use least squares to estimate the model with a lagged x and a lagged y, but without the restriction implied by an AR(1) error specification
9.6 Autoregressive Distributed Lag Models
– Two examples:
ARDL(1,1): INF̂_t = 0.3336 + 0.5593 INF_{t-1} − 0.6882 DU_t + 0.3200 DU_{t-1}
ARDL(1,0): INF̂_t = 0.3548 + 0.5282 INF_{t-1} − 0.4909 DU_t
9.6.1 The Phillips Curve
Eq. 9.56  INF̂_t = 0.3548 + 0.5282 INF_{t-1} − 0.4909 DU_t,  obs = 90
(se)              (0.0876)  (0.0851)          (0.1921)
Eq. 9.57  INF̂_t = … + 0.2819 INF_{t-4} − 0.7902 DU_t,  obs = 87
(se)  (0.0983) (0.1016) (0.1038) (0.1050) (0.1014) (0.1885)
9.6.1 The Phillips Curve
9.6.2 Okun's Law
9.6.3 Autoregressive Models
Eq. 9.60  y_t = δ + θ_1 y_{t-1} + θ_2 y_{t-2} + … + θ_p y_{t-p} + v_t
9.7.1 Forecasting with an AR Model
G_{T+1} = δ + θ_1 G_T + θ_2 G_{T-1} + v_{T+1}
For two quarters ahead, the forecast for G_{2010Q1} is:
Ĝ_{T+2} = δ̂ + θ̂_1 Ĝ_{T+1} + θ̂_2 G_T
A 95% interval forecast is:
Ĝ_{T+j} ± t_{(0.975, df)} σ̂_j
where σ̂_j is the standard error of the forecast error and df is the number of degrees of freedom in the estimation of the AR model
9.7.1 Forecasting with an AR Model
The one-step forecast error is:
u_1 = G_{T+1} − Ĝ_{T+1} = (δ − δ̂) + (θ_1 − θ̂_1) G_T + (θ_2 − θ̂_2) G_{T-1} + v_{T+1}
Ignoring the error from estimating the coefficients, we get:
Eq. 9.66  u_1 = v_{T+1}
Eq. 9.67  u_2 = θ_1 (G_{T+1} − Ĝ_{T+1}) + v_{T+2} = θ_1 u_1 + v_{T+2} = θ_1 v_{T+1} + v_{T+2}
Eq. 9.68  u_3 = θ_1 u_2 + θ_2 u_1 + v_{T+3} = (θ_1² + θ_2) v_{T+1} + θ_1 v_{T+2} + v_{T+3}
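Because the v's are uncorrelated, Eqs. 9.66–9.68 imply var(u_1) = σ_v², var(u_2) = σ_v²(1 + θ_1²), and var(u_3) = σ_v²[(θ_1² + θ_2)² + θ_1² + 1]. A small helper (the parameter values are illustrative, not estimates from the text):

```python
def forecast_error_vars(th1, th2, s2):
    # Forecast-error variances for 1-, 2-, and 3-step-ahead AR(2) forecasts,
    # obtained by squaring the v-coefficients in Eqs. 9.66-9.68
    v1 = s2
    v2 = s2 * (1 + th1**2)
    v3 = s2 * ((th1**2 + th2) ** 2 + th1**2 + 1)
    return v1, v2, v3

# Illustrative values: theta1 = 0.5, theta2 = 0.2, sigma_v^2 = 1
v1, v2, v3 = forecast_error_vars(0.5, 0.2, 1.0)
```

Note that the variances are non-decreasing in the forecast horizon, which is why interval forecasts widen as we forecast further ahead.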
9.7.2 Forecasting with an ARDL Model
– Rearranging:
9.7.3 Exponential Smoothing
ŷ_{T+1} = (y_T + y_{T-1} + y_{T-2}) / 3
– This forecasting rule is an example of a simple (equally-weighted) moving average model with k = 3
9.7.3 Exponential Smoothing
Eq. 9.72  ŷ_{T+1} = αy_T + α(1 − α) y_{T-1} + α(1 − α)² y_{T-2} + …
where the weights sum to one: Σ_{s=0}^{∞} α(1 − α)^s = 1
9.7.3 Exponential Smoothing
Lagging Eq. 9.72 one period and multiplying by (1 − α):
Eq. 9.73  (1 − α) ŷ_T = α(1 − α) y_{T-1} + α(1 − α)² y_{T-2} + α(1 − α)³ y_{T-3} + …
so that ŷ_{T+1} = αy_T + (1 − α) ŷ_T
9.7.3 Exponential Smoothing
The value of α can reflect one's judgment about the relative weight of current information
– It can be estimated from historical information by obtaining within-sample forecasts:
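One simple way to choose α from historical data is a grid search over the within-sample sum of squared one-step forecast errors. A sketch with made-up data; the startup choice ŷ_1 = y_1 is an assumption (one common convention), not the text's prescription:

```python
import numpy as np

def exp_smooth(y, alpha):
    # Recursive form of Eq. 9.72: yhat_{t+1} = alpha * y_t + (1 - alpha) * yhat_t
    yhat = np.empty(len(y))
    yhat[0] = y[0]                       # startup convention (an assumption here)
    for t in range(1, len(y)):
        yhat[t] = alpha * y[t - 1] + (1 - alpha) * yhat[t - 1]
    return yhat

# Made-up series; minimize the within-sample sum of squared forecast errors over alpha
y = np.array([2.0, 2.5, 1.8, 2.2, 2.9, 2.4, 2.1, 2.6])
grid = np.linspace(0.01, 0.99, 99)
sse = [np.sum((y[1:] - exp_smooth(y, a)[1:]) ** 2) for a in grid]
best_alpha = grid[int(np.argmin(sse))]
```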
9.8 Multiplier Analysis
Eq. 9.77  y_t = δ + θ_1 y_{t-1} + … + θ_p y_{t-p} + δ_0 x_t + δ_1 x_{t-1} + … + δ_q x_{t-q} + v_t
Eq. 9.78  y_t = α + β_0 x_t + β_1 x_{t-1} + β_2 x_{t-2} + β_3 x_{t-3} + … + e_t
Σ_{j=0}^{s} β_j = s-period interim multiplier
Σ_{j=0}^{∞} β_j = total multiplier
Using the lag operator L:
Eq. 9.79  y_t = δ + θ_1 Ly_t + θ_2 L²y_t + … + θ_p L^p y_t + δ_0 x_t + δ_1 Lx_t + δ_2 L²x_t + … + δ_q L^q x_t + v_t
Rearranging terms:
Eq. 9.80  (1 − θ_1 L − θ_2 L² − … − θ_p L^p) y_t = δ + (δ_0 + δ_1 L + δ_2 L² + … + δ_q L^q) x_t + v_t
For the ARDL(1,1) case, (1 − θ_1 L) y_t = δ + (δ_0 + δ_1 L) x_t + v_t, so:
Eq. 9.86  (1 − θ_1 L)^{-1} (δ_0 + δ_1 L) = β_0 + β_1 L + β_2 L² + β_3 L³ + …  and  e_t = (1 − θ_1 L)^{-1} v_t
Eq. 9.87  δ_0 + δ_1 L = (1 − θ_1 L)(β_0 + β_1 L + β_2 L² + β_3 L³ + …)
Eq. 9.88  = β_0 + β_1 L + β_2 L² + β_3 L³ + … − β_0 θ_1 L − β_1 θ_1 L² − β_2 θ_1 L³ − …
          = β_0 + (β_1 − β_0 θ_1) L + (β_2 − β_1 θ_1) L² + (β_3 − β_2 θ_1) L³ + …
The total multiplier is:
Σ_{j=0}^{∞} β_j = (δ_0 + δ_1) / (1 − θ_1) = (−0.184084 + δ̂_1) / (1 − 0.350116) = −0.4358
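Matching coefficients on powers of L in Eq. 9.88 gives β_0 = δ_0, β_1 = δ_1 + β_0θ_1, and β_j = β_{j-1}θ_1 for j ≥ 2, and the sum of the β_j converges to (δ_0 + δ_1)/(1 − θ_1). A sketch with made-up coefficients (δ_0 = −0.2, δ_1 = −0.1, θ_1 = 0.35 are illustrative, not the text's estimates):

```python
def ardl11_multipliers(delta0, delta1, theta1, n=50):
    # Delay multipliers beta_j from the Eq. 9.88 recursion:
    # beta_0 = delta_0; beta_1 = delta_1 + beta_0*theta_1; beta_j = beta_{j-1}*theta_1 (j >= 2)
    betas = [delta0, delta1 + delta0 * theta1]
    for _ in range(2, n):
        betas.append(betas[-1] * theta1)
    return betas

# Illustrative coefficients only
betas = ardl11_multipliers(-0.2, -0.1, 0.35)
total = (-0.2 + -0.1) / (1 - 0.35)   # total multiplier (delta0 + delta1)/(1 - theta1)
```

Since |θ_1| < 1, the β_j decay geometrically, so truncating the sum at a moderate n already reproduces the closed-form total.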
Eq. 9A.1  d = Σ_{t=2}^{T} (ê_t − ê_{t-1})² / Σ_{t=1}^{T} ê_t²
Expanding the numerator:
Eq. 9A.2  d = [Σ_{t=2}^{T} ê_t² + Σ_{t=2}^{T} ê_{t-1}² − 2 Σ_{t=2}^{T} ê_t ê_{t-1}] / Σ_{t=1}^{T} ê_t² ≈ 1 + 1 − 2r_1
9A The Durbin-Watson Test
Eq. 9A.3  d ≈ 2(1 − r_1)
– If the estimated value of ρ is r_1 = 0, then the Durbin-Watson statistic d ≈ 2
• This is taken as an indication that the model
errors are not autocorrelated
– If the estimate of ρ happened to be r1 = 1 then
d≈0
• A low value for the Durbin-Watson statistic
implies that the model errors are correlated,
and ρ > 0
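A direct computation confirms the approximation: for residuals with no serial correlation, d comes out near 2. The residuals below are simulated white noise, purely for illustration:

```python
import numpy as np

def durbin_watson(ehat):
    # Eq. 9A.1: d = sum_{t=2}^{T} (e_t - e_{t-1})^2 / sum_{t=1}^{T} e_t^2
    ehat = np.asarray(ehat, dtype=float)
    return np.sum(np.diff(ehat) ** 2) / np.sum(ehat ** 2)

# Uncorrelated (white-noise) residuals should give d close to 2
rng = np.random.default_rng(3)
e = rng.normal(size=5000)
d = durbin_watson(e)
```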
9A The Durbin-Watson Test
FIGURE 9A.1 Testing for positive autocorrelation
9A.1 The Durbin-Watson Bounds Test
9B Properties of the AR(1) Error
Note that:
Eq. 9B.1  e_t = ρe_{t-1} + v_t = ρ(ρe_{t-2} + v_{t-1}) + v_t = ρ²e_{t-2} + ρv_{t-1} + v_t
Further substitution shows that:
Eq. 9B.2  e_t = ρ²(ρe_{t-3} + v_{t-2}) + ρv_{t-1} + v_t = ρ³e_{t-3} + ρ²v_{t-2} + ρv_{t-1} + v_t
The covariance at lag one is:
cov(e_t, e_{t-1}) = ρE(v_{t-1}²) + ρ³E(v_{t-2}²) + ρ⁵E(v_{t-3}²) + …
                 = ρσ_v² (1 + ρ² + ρ⁴ + …)
                 = ρσ_v² / (1 − ρ²)
9B Properties of the AR(1) Error
In general:
cov(e_t, e_{t-k}) = ρ^k σ_v² / (1 − ρ²),  k > 0
9C Generalized Least Squares Estimation
The model is y_t = β_1 + β_2 x_t + e_t with e_t = ρe_{t-1} + v_t
– Rearranging terms for the first observation:
√(1 − ρ²) y_1 = √(1 − ρ²) β_1 + β_2 √(1 − ρ²) x_1 + √(1 − ρ²) e_1
Or:
y_1* = β_1 x_{11}* + β_2 x_{12}* + e_1*
where
Eq. 9C.6  y_1* = √(1 − ρ²) y_1,  x_{11}* = √(1 − ρ²),  x_{12}* = √(1 − ρ²) x_1,  e_1* = √(1 − ρ²) e_1
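The full GLS transformation quasi-differences observations 2 through T and scales the first observation as in Eq. 9C.6. A sketch (the function and variable names are mine, not the text's):

```python
import numpy as np

def ar1_gls_transform(y, x, rho):
    # Quasi-difference obs 2..T: y_t* = y_t - rho*y_{t-1}, etc.;
    # scale the first observation by sqrt(1 - rho^2) as in Eq. 9C.6
    y = np.asarray(y, dtype=float)
    x = np.asarray(x, dtype=float)
    w = np.sqrt(1 - rho**2)
    y_star = np.concatenate([[w * y[0]], y[1:] - rho * y[:-1]])
    x1_star = np.concatenate([[w], np.full(len(y) - 1, 1 - rho)])   # transformed intercept column
    x2_star = np.concatenate([[w * x[0]], x[1:] - rho * x[:-1]])
    return y_star, x1_star, x2_star

# Tiny made-up example with rho = 0.5
y_s, c_s, x_s = ar1_gls_transform([1.0, 2.0, 1.5], [0.3, 0.1, 0.4], 0.5)
```

Least squares applied to the transformed data (y*, intercept column, x*) then has uncorrelated, homoskedastic errors, which is the point of the GLS transformation.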