

From   Kit Baum <>
Subject   st: re: program ADF test with Perron's procedure
Date   Wed, 3 Oct 2007 11:46:56 -0400

Myriam said

I am trying to write a program to test for a unit root (ADF
test), using Perron's procedure to choose the
maximum lag length included in the regression.

The idea is to perform a first regression with 8 lags.
If the absolute value of the t-statistic on the eighth
lagged term is less than 1.645 (i.e., the approximate 5%
significance level in an asymptotic normal
distribution), then that term is dropped and the
procedure is repeated for the seventh lagged term. The
procedure stops when the last remaining lagged term is
significant, in which case all shorter
first-differenced lag terms remain in the test, or
when zero lagged terms remain.
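The sequential (general-to-specific) rule described above can be sketched directly. Here is a minimal illustration in Python/numpy (names like perron_lag_select are my own, not Stata code): estimate the ADF regression with the maximum lag, inspect the t-statistic on the longest lag, and drop it and re-estimate until the longest remaining lag is significant or no lags are left.

```python
import numpy as np

def perron_lag_select(y, max_lag=8, crit=1.645):
    """General-to-specific lag choice for an ADF regression
    (Perron's procedure, as described above).

    Regress Dy_t on a constant, y_{t-1}, and Dy_{t-1}..Dy_{t-k},
    starting at k = max_lag; drop the longest lag whenever its
    t-statistic is below crit in absolute value, and stop at the
    first k whose longest lag is significant (else return 0).
    """
    y = np.asarray(y, dtype=float)
    dy = np.diff(y)
    T = len(dy)
    for k in range(max_lag, 0, -1):
        rows = T - k
        # Design matrix: constant, lagged level, k lagged differences
        X = np.column_stack(
            [np.ones(rows), y[k:-1]]
            + [dy[k - j:T - j] for j in range(1, k + 1)]
        )
        resp = dy[k:]
        beta, *_ = np.linalg.lstsq(X, resp, rcond=None)
        resid = resp - X @ beta
        s2 = resid @ resid / (rows - X.shape[1])
        cov = s2 * np.linalg.inv(X.T @ X)
        # Last column is the longest (k-th) lagged difference
        t_last = beta[-1] / np.sqrt(cov[-1, -1])
        if abs(t_last) >= crit:
            return k
    return 0
```

This is only a sketch of the selection rule itself; in practice you would still compute the ADF test statistic at the chosen lag and compare it to Dickey-Fuller (not normal) critical values.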

That logic is built into the -dfgls- routine of Baum and Sperling and into official Stata's -dfgls-. The Elliott-Rothenberg-Stock DF-GLS procedure is considerably more powerful than any ADF test under a wide variety of circumstances. Why not just use -dfgls-, which determines the optimal lag length if you allow for plenty of lags?
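For example, assuming the series is tsset and named y (the variable name is illustrative), a single command reports the sequential-t, SC, and MAIC lag choices along with the DF-GLS statistics at each lag:

```
. dfgls y, maxlag(12)
```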

Kit Baum, Boston College Economics and DIW Berlin
An Introduction to Modern Econometrics Using Stata:

