Session: Heteroscedasticity


Heteroscedasticity

DR. INDRA, S.Si, M.Si


What is Heteroskedasticity
Review the Gauss-Markov assumptions:
1. Linear regression model: y = β1 + β2x + e
2. The error term has a mean of zero: E(e) = 0, so E(y) = β1 + β2x
3. Data on X are not random and thus are uncorrelated with the error term: Cov(X, e) = E(Xe) = 0
4. The error term has constant variance: Var(e) = E(e²) = σ²
   This is the assumption of a homoskedastic error: a homoskedastic error is one that has constant variance, while a heteroskedastic error is one that has a nonconstant variance.
5. The error term is not correlated with itself (no serial correlation): Cov(ei, ej) = E(ei ej) = 0 for i ≠ j

Heteroskedasticity is more commonly a problem for cross-section data sets, although a time-series model can also have a non-constant variance.

What is Heteroskedasticity

Hetero (different or unequal) is the opposite of Homo (same or equal)…

Skedastic means spread or scatter…

Homoskedasticity = equal spread


Heteroskedasticity = unequal spread
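The difference between equal and unequal spread is easy to see in a short simulation. The sketch below (NumPy; all variable names and parameter values are illustrative, not from the lecture) draws one homoskedastic and one heteroskedastic error series and compares their spread across the range of x:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
x = np.linspace(1, 10, n)

# Homoskedastic errors: the spread does not depend on x
e_homo = rng.normal(0.0, 1.0, n)
# Heteroskedastic errors: the spread grows with x (unequal spread)
e_hetero = rng.normal(0.0, 0.3 * x, n)

# Compare the sample spread in the lower and upper halves of x
lo, hi = x < x.mean(), x >= x.mean()
print(e_homo[lo].std(), e_homo[hi].std())      # roughly equal
print(e_hetero[lo].std(), e_hetero[hi].std())  # clearly unequal
```

Plotting either series against x would reproduce the familiar constant-band versus funnel-shaped residual pictures.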
Homoscedasticity vs Heteroscedasticity

(Figure slides: scatter plots contrasting equal spread around the regression line with spread that changes across X.)
HETEROSCEDASTICITY

➢ One of the assumptions of the classical linear regression model (CLRM) is that the variance of ui, the error term, is constant, or homoscedastic.
➢ Reasons for heteroscedasticity are many, including:
  ➢ The presence of outliers in the data
  ➢ Incorrect functional form of the regression model
  ➢ Incorrect transformation of data
  ➢ Mixing observations with different measures of scale (such as mixing high-income households with low-income households)
CONSEQUENCES

➢ If heteroscedasticity exists, several consequences ensue:
  ➢ The OLS estimators are still unbiased and consistent, yet they are less efficient, making statistical inference less reliable (i.e., the estimated t values may not be reliable).
  ➢ Thus, the estimators are not best linear unbiased estimators (BLUE); they are simply linear unbiased estimators (LUE).
  ➢ In the presence of heteroscedasticity, BLUE estimators are provided by the method of weighted least squares (WLS).
DETECTION OF HETEROSCEDASTICITY

➢Graph squared residuals against predicted Y


➢Park Test
➢Glejser Test
➢Harvey-Godfrey Test
➢Breusch-Pagan-Godfrey Test
➢Engle’s ARCH Test
➢White’s Test
Graph Squared Residuals Against Predicted Y

(Figure slides: five residual-vs-fitted plots; one shows a constant band (homoscedastic), the other four show spread that changes with the fitted values (heteroscedastic).)
The Park LM Test

Step 1: Estimate the model by OLS and obtain the residuals.

Step 2: Run the following auxiliary regression:

ln ût² = a1 + a2 ln Z2t + a3 ln Z3t + … + ap ln Zpt + vt

Step 3: Compute LM = nR², where n and R² are from the auxiliary regression.

Step 4: If LM-stat > the χ²(p−1) critical value, reject the null and conclude that there is significant evidence of heteroskedasticity.
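The four steps can be sketched with plain NumPy. The data below are simulated so the test has something to detect; all names and values are illustrative, not from the lecture's EViews example:

```python
import numpy as np

def ols(y, X):
    """OLS with an added intercept; returns residuals and R-squared."""
    Xc = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Xc, y, rcond=None)
    resid = y - Xc @ beta
    r2 = 1.0 - resid @ resid / np.sum((y - y.mean()) ** 2)
    return resid, r2

rng = np.random.default_rng(1)
n = 400
z = rng.uniform(1, 10, n)                       # regressor, also the Z of step 2
y = 2 + 0.5 * z + rng.normal(0, 0.2 * z, n)     # error spread rises with z

u_hat, _ = ols(y, z)                            # Step 1: OLS residuals
_, r2_aux = ols(np.log(u_hat ** 2), np.log(z))  # Step 2: ln(u^2) on ln(Z)
lm = n * r2_aux                                 # Step 3: LM = n * R^2
# Step 4: compare with the chi-square(1) 5% critical value, 3.84
print(lm, lm > 3.84)
```

With variance built into the simulated errors, the LM statistic comfortably exceeds the critical value and the null of homoskedasticity is rejected.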
The Glejser LM Test

Step 1: Estimate the model by OLS and obtain the residuals.

Step 2: Run the following auxiliary regression:

|ût| = a1 + a2 Z2t + a3 Z3t + … + ap Zpt + vt

Step 3: Compute LM = nR², where n and R² are from the auxiliary regression.

Step 4: If LM-stat > the χ²(p−1) critical value, reject the null and conclude that there is significant evidence of heteroskedasticity.
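The Glejser steps differ from Park's only in the auxiliary regression: the absolute residuals are regressed on Z in levels. A minimal NumPy sketch on simulated data (illustrative names and values):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 400
z = rng.uniform(1, 10, n)
y = 1 + 2 * z + rng.normal(0, 0.3 * z, n)   # heteroskedastic by construction

# Step 1: OLS residuals from the main regression
X = np.column_stack([np.ones(n), z])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
u = y - X @ b

# Step 2: Glejser auxiliary regression of |u| on Z
a, *_ = np.linalg.lstsq(X, np.abs(u), rcond=None)
e = np.abs(u) - X @ a
r2 = 1.0 - e @ e / np.sum((np.abs(u) - np.abs(u).mean()) ** 2)

# Steps 3-4: LM = n * R^2 against chi-square(1); 5% critical value 3.84
lm = n * r2
print(lm > 3.84)
```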
The Glejser LM Test

Dependent Variable: URATE
Method: Least Squares
Date: 03/08/19  Time: 20:11
Sample: 1959M01 1989M12
Included observations: 372

Variable   Coefficient   Std. Error   t-Statistic   Prob.
IP          0.088333     0.003924     22.50822      0.0000
M1         -0.029656     0.001058    -28.03067      0.0000
PPI         0.207517     0.008114     25.57416      0.0000
TB3        -0.533460     0.034155    -15.61888      0.0000

R-squared            0.576320   Mean dependent var     6.069086
Adjusted R-squared   0.572866   S.D. dependent var     1.592532
S.E. of regression   1.040807   Akaike info criterion  2.928564
Sum squared resid    398.6468   Schwarz criterion      2.970703
Log likelihood      -540.7130   Hannan-Quinn criter.   2.945299
Durbin-Watson stat   0.107131

Heteroskedasticity Test: Glejser
F-statistic          16.40075   Prob. F(4,367)         0.0000
Obs*R-squared        56.41273   Prob. Chi-Square(4)    0.0000
Scaled explained SS  44.33222   Prob. Chi-Square(4)    0.0000
The Harvey-Godfrey LM Test

Step 1: Estimate the model by OLS and obtain the residuals.

Step 2: Run the following auxiliary regression:

ln ût² = a1 + a2 Z2t + a3 Z3t + … + ap Zpt + vt

Step 3: Compute LM = nR², where n and R² are from the auxiliary regression.

Step 4: If LM-stat > the χ²(p−1) critical value, reject the null and conclude that there is significant evidence of heteroskedasticity.
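Harvey-Godfrey again changes only the auxiliary regression: the log of the squared residuals is regressed on Z in levels. A NumPy sketch on simulated data (illustrative names and values):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 400
z = rng.uniform(1, 10, n)
y = 1 + 2 * z + rng.normal(0, 0.3 * z, n)   # heteroskedastic by construction

# Step 1: OLS residuals from the main regression
X = np.column_stack([np.ones(n), z])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
u = y - X @ b

# Step 2: Harvey-Godfrey auxiliary regression of ln(u^2) on Z (in levels)
lu2 = np.log(u ** 2)
a, *_ = np.linalg.lstsq(X, lu2, rcond=None)
e = lu2 - X @ a
r2 = 1.0 - e @ e / np.sum((lu2 - lu2.mean()) ** 2)

# Steps 3-4: LM = n * R^2 against chi-square(1); 5% critical value 3.84
lm = n * r2
print(lm > 3.84)
```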
The Harvey-Godfrey LM Test

(EViews output for the same URATE regression as above.)

Heteroskedasticity Test: Harvey
F-statistic          12.78525   Prob. F(4,367)         0.0000
Obs*R-squared        45.49771   Prob. Chi-Square(4)    0.0000
Scaled explained SS  37.10664   Prob. Chi-Square(4)    0.0000
The Breusch-Pagan-Godfrey Test

Step 1: Estimate the model by OLS and obtain the residuals.

Step 2: Run the following auxiliary regression:

ût² = a1 + a2 Z2t + a3 Z3t + … + ap Zpt + vt

Step 3: Compute LM = nR², where n and R² are from the auxiliary regression.

Step 4: If LM-stat > the χ²(p−1) critical value, reject the null and conclude that there is significant evidence of heteroskedasticity.
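Here the auxiliary regression uses the squared residuals themselves on Z. A NumPy sketch of the Breusch-Pagan-Godfrey steps on simulated data (illustrative names and values; statistical packages also offer this test ready-made, e.g. `het_breuschpagan` in statsmodels):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 400
z = rng.uniform(1, 10, n)
y = 1 + 2 * z + rng.normal(0, 0.3 * z, n)   # heteroskedastic by construction

# Step 1: OLS residuals from the main regression
X = np.column_stack([np.ones(n), z])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
u2 = (y - X @ b) ** 2

# Step 2: Breusch-Pagan-Godfrey auxiliary regression of u^2 on Z
a, *_ = np.linalg.lstsq(X, u2, rcond=None)
e = u2 - X @ a
r2 = 1.0 - e @ e / np.sum((u2 - u2.mean()) ** 2)

# Steps 3-4: LM = n * R^2 against chi-square(1); 5% critical value 3.84
lm = n * r2
print(lm > 3.84)
```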
The Breusch-Pagan-Godfrey Test

(EViews output for the same URATE regression as above.)

Heteroskedasticity Test: Breusch-Pagan-Godfrey
F-statistic          13.59443   Prob. F(4,367)         0.0000
Obs*R-squared        48.00565   Prob. Chi-Square(4)    0.0000
Scaled explained SS  31.61716   Prob. Chi-Square(4)    0.0000
Engle's ARCH Test

Engle introduced a new concept, allowing heteroskedasticity to occur in the variance of the error terms over time: the variance depends on the errors' own past rather than on the explanatory variables.
The key idea is that the variance of ut depends on the size of the squared error term lagged one period, u²t−1, for the first-order model:

Var(ut) = γ1 + γ2 u²t−1

The model can easily be extended to higher orders:

Var(ut) = γ1 + γ2 u²t−1 + … + γp u²t−p
Engle's ARCH Test

Step 1: Estimate the model by OLS and obtain the residuals.

Step 2: Regress the squared residuals on a constant and lagged squared residuals; the number of lags is determined by the hypothesized order of the ARCH effects.

Step 3: Compute the LM statistic = (n − p)R² from the auxiliary regression, where p is the number of lags, and compare it with the chi-square critical value.

Step 4: Conclude.
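The steps above can be sketched in NumPy on simulated ARCH(1) errors (all names and parameter values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 600
# Simulate ARCH(1) errors: today's variance depends on yesterday's squared error
u = np.zeros(n)
for t in range(1, n):
    sigma2 = 0.2 + 0.5 * u[t - 1] ** 2
    u[t] = rng.normal(0.0, np.sqrt(sigma2))

# Step 2: regress u_t^2 on a constant and u_{t-1}^2 (one lag, ARCH(1))
y = u[1:] ** 2
X = np.column_stack([np.ones(n - 1), u[:-1] ** 2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
e = y - X @ beta
r2 = 1.0 - e @ e / np.sum((y - y.mean()) ** 2)

# Step 3: LM = (n - p) * R^2 with p = 1 lag; chi-square(1) 5% critical value 3.84
lm = (n - 1) * r2
print(lm > 3.84)
```

The estimated coefficient on the lagged squared error recovers the positive γ2 built into the simulation, and the LM statistic rejects the null of no ARCH effects.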
Engle's ARCH Test

(EViews output for the same URATE regression as above.)

Heteroskedasticity Test: ARCH
F-statistic          1129.631   Prob. F(1,369)         0.0000
Obs*R-squared        279.6507   Prob. Chi-Square(1)    0.0000
White's Test

Step 1: Estimate the model by OLS and obtain the residuals.

Step 2: Run the following auxiliary regression (for two regressors):

ût² = a1 + a2 X2t + a3 X3t + a4 X²2t + a5 X²3t + a6 X2t X3t + vt

Step 3: Compute LM = nR², where n and R² are from the auxiliary regression.

Step 4: If LM-stat > the χ²(p−1) critical value, reject the null and conclude that there is significant evidence of heteroskedasticity.
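White's auxiliary regression uses the regressors, their squares, and their cross products. A NumPy sketch on simulated data with two regressors (illustrative names and values):

```python
import numpy as np

rng = np.random.default_rng(6)
n = 400
x2 = rng.uniform(1, 5, n)
x3 = rng.uniform(1, 5, n)
y = 1 + x2 + 0.5 * x3 + rng.normal(0, 0.4 * x2, n)  # variance tied to x2

# Step 1: OLS residuals from the main regression
X = np.column_stack([np.ones(n), x2, x3])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
u2 = (y - X @ b) ** 2

# Step 2: auxiliary regression on levels, squares, and the cross product
Z = np.column_stack([np.ones(n), x2, x3, x2 ** 2, x3 ** 2, x2 * x3])
a, *_ = np.linalg.lstsq(Z, u2, rcond=None)
e = u2 - Z @ a
r2 = 1.0 - e @ e / np.sum((u2 - u2.mean()) ** 2)

# Steps 3-4: LM = n * R^2; 5 slope terms, so chi-square(5), 5% critical value 11.07
lm = n * r2
print(lm > 11.07)
```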
White's Test

(EViews output for the same URATE regression as above.)

Heteroskedasticity Test: White
F-statistic          13.87367   Prob. F(10,361)        0.0000
Obs*R-squared        103.2745   Prob. Chi-Square(10)   0.0000
Scaled explained SS  68.01797   Prob. Chi-Square(10)   0.0000
Resolving Heteroskedasticity

We have three different approaches:

(a) Generalized Least Squares
(b) Weighted Least Squares
(c) Heteroskedasticity-Consistent Estimation Methods
Generalized Least Squares

Consider the model

Yt = β1 + β2X2t + β3X3t + β4X4t + … + βkXkt + ut

where

Var(ut) = σt²
Generalized Least Squares

If we divide each term by the standard deviation of the error term, σt, we get:

Yt/σt = β1(1/σt) + β2X2t/σt + β3X3t/σt + … + βkXkt/σt + ut/σt

or

Y*t = β*1 + β*2X*2t + β*3X*3t + … + β*kX*kt + u*t

where we now have:

Var(u*t) = Var(ut/σt) = Var(ut)/σt² = 1
Weighted Least Squares

The GLS procedure is the same as WLS, where we have weights, wt, adjusting our variables.
Define wt = 1/σt and rewrite the original model as:

wtYt = β1wt + β2X2twt + β3X3twt + … + βkXktwt + utwt

If we define wtYt = Y*t and wtXit = X*it, we get:

Y*t = β*1 + β*2X*2t + β*3X*3t + … + β*kX*kt + u*t
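The WLS transformation is a one-liner once the σt are known. The sketch below simulates data with a known error standard deviation, applies the weights wt = 1/σt to every term (including the constant), and checks that the transformed error is homoskedastic with variance 1 (all names and values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 500
x = rng.uniform(1, 10, n)
sigma = 0.5 * x                          # known error standard deviations
y = 2 + 3 * x + rng.normal(0, sigma, n)  # true model: Y = 2 + 3X + u

# WLS: multiply every term (including the constant) by w_t = 1/sigma_t
w = 1.0 / sigma
Xs = np.column_stack([w, x * w])         # transformed constant and regressor
ys = y * w
beta, *_ = np.linalg.lstsq(Xs, ys, rcond=None)

# The transformed error u* = u/sigma is homoskedastic with variance 1
resid_star = ys - Xs @ beta
print(beta, resid_star.var())            # beta close to the true (2, 3)
```

In practice the σt are unknown; they are typically replaced by fitted values from one of the auxiliary regressions above, giving feasible GLS.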
