
Re: st: RE: Multicollinearity test in IV regression


From   Stas Kolenikov <[email protected]>
To   [email protected]
Subject   Re: st: RE: Multicollinearity test in IV regression
Date   Wed, 13 Oct 2004 14:56:13 -0400

On Wed, 13 Oct 2004 18:10:10 +0100, Nick Cox <[email protected]> wrote:
> I assert that multicollinearity is a property of the
> predictors and does not depend on what
> you do with them before, during or
> after any examination of multicollinearity.

Indeed, multicollinearity has nothing to do with the estimation
method; it is an intrinsic property of the regressor configuration.
Any good regression book (not an econometrics book!) will have a
discussion of multicollinearity. One of the basic references
(actually written by economists) is Belsley, Kuh and Welsch; other
books to look at are Fox (it seems to me he works in psychology,
although I am not sure, so his examples are more pertinent for the
social sciences) and the classic text by Draper and Smith. My advisor
at UNC, Richard Smith, has compiled a very modern and comprehensive
text on regression, but just does not seem to have time to polish it
for publication; otherwise, that would be the default reference I
would provide.

There are no "formal" tests for collinearity; all the measures are
ad hoc. The most advanced one is to take the singular value
decomposition of the regressor matrix and look at the singular values
close to zero -- they correspond to linear combinations of the
regressors that have little variability, so the coefficients in those
directions cannot be estimated with much precision. Principal
component analysis of the regressor matrix serves the same purpose.
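
For illustration only -- a minimal sketch on the auto data that ships
with Stata, with regressors picked arbitrarily -- the eigenvalues
reported by -pca- are those of the correlation matrix of the
regressors (up to scaling, the squared singular values of the
standardized regressor matrix), and values close to zero flag
near-collinearity:

    sysuse auto, clear
    * PCA of the regressors: eigenvalues near zero point to
    * directions with little variability, i.e. near-collinearity
    pca mpg weight length displacement
    * the same eigenvalues by hand, plus the condition index
    correlate mpg weight length displacement
    matrix R = r(C)
    matrix symeigen V lambda = R
    display "condition index = " sqrt(lambda[1,1]/lambda[1,4])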

Stata has the -vif- (variance inflation factors) command, which shows
by how much the variance of each estimated coefficient is inflated
compared to the hypothetical case in which the regressors are
orthogonal to each other.
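
A quick usage sketch, again on the auto data: the formula behind the
command is VIF_j = 1/(1 - R^2_j), where R^2_j comes from regressing
the j-th regressor on all the others. (In more recent versions of
Stata the same output is available as -estat vif- after -regress-.)

    sysuse auto, clear
    regress price mpg weight length
    * variance inflation factors for the regressors just used
    vif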

The methods for dealing with collinearity are tightly related to
variable selection methods and to regularization approaches such as
ridge regression, principal components regression, the lasso, etc.
Again, Richard Smith's unpublished manuscript deals with them quite
nicely, and among the published sources I would recommend Hastie,
Tibshirani and Friedman's book "The Elements of Statistical
Learning".
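
As a minimal sketch -- using only official commands, on the auto data
-- principal components regression simply replaces the collinear
regressors with a few leading principal components; ridge regression
and the lasso need user-written additions:

    sysuse auto, clear
    pca mpg weight length displacement
    * keep the first two components as regressors
    * (older Statas use the -score- command instead of -predict-)
    predict pc1 pc2, score
    regress price pc1 pc2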

-- 
Stas Kolenikov
http://stas.kolenikov.name
*
*   For searches and help try:
*   http://www.stata.com/support/faqs/res/findit.html
*   http://www.stata.com/support/statalist/faq
*   http://www.ats.ucla.edu/stat/stata/


