This might seem like a silly question, but it stems from my inexperience with Stata. I've created (with a bit of help from StataCorp) a program that runs ADF tests on a series for each lag length up to a maximum specified by the user. It gets to the point where the AIC and BIC are reported in a matrix called "criteria". Here's the code I've written:
************************************************
program define adftest263, rclass
    version 9
    syntax varlist(max=1) [if] [in] [, Lags(integer 1)]
    foreach var of local varlist {
        * Build up a list of stored estimation results, one per lag
        local storage ""
        forvalues i = 1/`lags' {
            display "ADF(`i') of `var'"
            dfuller `var', constant lags(`i') regress
            estimates store adf_`i'_`var'
            local storage `storage' adf_`i'_`var'
        }
        * r(S) from -estimates stats- holds AIC in column 5, BIC in column 6
        estimates stats `storage'
        matrix s = r(S)
        matrix list s
        matrix criteria = J(`lags', 2, .)
        forvalues i = 1/`lags' {
            matrix criteria[`i',1] = s[`i',5]
            matrix criteria[`i',2] = s[`i',6]
        }
        matrix list criteria
        ***** Up to here it works fine *****
    }
end
***********************************************************
I now want to select the first column of this matrix (the AIC column, so j = 1 for every row `i') and calculate its minimum. The row of that minimum should correspond to the "best fitting (by AIC) ADF lag length". I then want to select that lag length and run the test for the series at that lag to report the result.
Does anyone know how to do this? I'd greatly appreciate a hand. An example of what I'm trying to do:
The result of running the program with lags(2) would be a matrix displayed like this. The first column corresponds to the AIC and the second to the BIC, while the rows correspond to the number of lags in the ADF:
      c1    c2
r1   1.5   1.6
r2   0.5   0.5
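In case it helps clarify what I'm after, here is a rough sketch of the selection step I have in mind. It assumes the `criteria' matrix and the `lags' local from the program above are still in scope, and `best' is just a name I made up:

```stata
* Sketch only: scan column 1 (AIC) of -criteria- for the row with the
* smallest value, then re-run the ADF test at that lag.
local best = 1
forvalues i = 2/`lags' {
    if criteria[`i',1] < criteria[`best',1] {
        local best = `i'
    }
}
display "Best-fitting (by AIC) ADF lag length: `best'"
dfuller `var', constant lags(`best') regress
```

Is something along these lines the idiomatic way to do it, or is there a built-in function for taking the minimum over a matrix column?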