The Stata listserver

st: xtlsdvc, number of lags, time dummies

From   "Ivan Tasic" <[email protected]>
To   <[email protected]>
Subject   st: xtlsdvc, number of lags, time dummies
Date   Mon, 4 Jul 2005 17:00:01 -0500


I have three questions. The basic model used for -xtlsdvc- is as follows:
y_it = gamma*y_i,t-1 + x'_it*beta + eta_i + e_it
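For reference, a minimal call under -xtlsdvc-'s syntax might look like the following (variable names are hypothetical):

```stata
* Hypothetical panel: id = unit, week = time, y = depvar, x1 x2 = regressors
xtset id week
* initial(ab): seed the correction with Arellano-Bond estimates;
* bias(3): include terms up to O(1/NT^2); vcov(50): 50 bootstrap reps for SEs
xtlsdvc y x1 x2, initial(ab) bias(3) vcov(50)
```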

1. Is there any way to specify how many lags of the dependent variable to use in -xtlsdvc-, or is one lag the only option? What is the consequence of including additional lags manually, as regressors?
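As a sketch of the manual-lag workaround (hypothetical names; note that the bias approximation is derived for a model with a single lag, so a second lag added by hand would presumably be treated as just another regressor, outside the correction):

```stata
* Second lag entered by hand via a time-series operator;
* the built-in correction still applies only to the first lag.
xtlsdvc y L2.y x1 x2, initial(ab) bias(3)
```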

2. Would the bias correction work properly if we include time dummies?
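If time dummies are wanted, one way to enter them (assuming -xtlsdvc- does not accept factor-variable notation, so the dummies are generated explicitly first) is:

```stata
* Generate one dummy per period and include all but the first
quietly tab week, gen(wk)
xtlsdvc y x1 x2 wk2-wk398, initial(ab) bias(3)
```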

3. I use a large TSCS panel (n=67, T=398, weekly data), and the bias looks very small compared to -xtreg, fe-. I guess the large-T asymptotics kick in pretty well. Since it took Stata 9 forever to calculate the bootstrapped standard errors (I did use the highest level of precision, O(1/NT^2)), is there any advice on how to speed this up, other than buying a faster computer?
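Assuming the bootstrap requested through vcov(#) dominates the run time, one obvious lever (a sketch, not tested on a panel this size) is to omit or shrink the replication count while exploring specifications, raising it only for the final run:

```stata
* Point estimates only (no bootstrap) while specifying the model
xtlsdvc y x1 x2, initial(ah) bias(3)
* Final run with a modest number of bootstrap replications
xtlsdvc y x1 x2, initial(ah) bias(3) vcov(25)
```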

I would appreciate any help.

Ivan Tasic

