The Stata listserver

Re: st: RE: Multicollinearity test in IV regression


From   Marcello Pagano <[email protected]>
To   [email protected]
Subject   Re: st: RE: Multicollinearity test in IV regression
Date   Wed, 13 Oct 2004 17:33:55 -0400

I agree somewhat with what Nick says, but I would like to take
it one step further. Indeed, mathematically one can think of the
predictors, over this particular sample region, as defining a linear
space. If that space is not of full rank, then there is multicollinearity.
That means that some linear combination of two or more of the
predictor variables (and possibly more than one such combination)
equals zero.
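
A toy sketch of this in Stata, using hypothetical variables: x3 is
constructed as an exact linear combination of x1 and x2, so the
predictors do not span a full-rank space and the collinearity is flagged.

clear
set obs 50
set seed 2004
generate x1 = rnormal()
generate x2 = rnormal()
generate x3 = x1 + x2          // exact linear combination: x1 + x2 - x3 = 0
_rmcoll x1 x2 x3               // notes a variable omitted because of collinearity
display "`r(varlist)'"         // variable list with the collinear term dropped or marked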

In the finite-precision world we live in with computers, zero is
interpreted to mean anything relatively small. This is especially important
in the statistical world we live in, where the predictors themselves
(big secret) may be measured with error.

These linear combinations are not unique, but, and here is where
I disagree with Nick, they are informative. This is partly because it is
not clear that the solution is the simple one Nick implies: drop one, or
more, variables until we get rid of the problem. A better solution might
be to replace all of the offending predictors by (a) linear combination(s)
that make sense. For example, if X1 + X2 is effectively zero, then drop
both X1 and X2 and replace them with the single new variable X1 - X2.
You can extend this to the more general situation.
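
A minimal Stata sketch of that replacement, with hypothetical names:
y is the outcome, x1 and x2 are the offending predictors, w is an
endogenous regressor, and z1 and z2 are its instruments.

generate x12 = x1 - x2         // single combination replacing x1 and x2
ivreg2 y x12 (w = z1 z2)       // refit the IV regression with the new variable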


Just a thought.

m.p.



Nick Cox wrote:


I assert that multicollinearity is a property of the predictors; it does not depend on what
you do with them before, during, or after any examination of multicollinearity, so the answer
does not hinge on which estimation command you use.
You can look at the structure of relationships with -graph matrix-; get numerical summaries by using -correlate-; and use -_rmcoll- to look further.
Whether there is some omnibus test of multicollinearity I do not know. If there were, it wouldn't necessarily be helpful in indicating what to do.
I have always found it most useful to think about the meanings of variables and the roles they play in terms of the underlying science. That is, reflection often makes it seem unsurprising
that two or more variables are highly correlated, so that they tell the same story and only one need be recorded.
However, there are other issues, particularly where many dummy variables are included.
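
A minimal sketch of those three commands, assuming hypothetical predictors x1, x2, and x3:

graph matrix x1 x2 x3          // matrix of pairwise scatterplots
correlate x1 x2 x3             // pairwise correlations
_rmcoll x1 x2 x3               // reports any variables omitted because of collinearity
display "`r(varlist)'"         // variable list with collinear terms dropped or marked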
Nick [email protected]
Rozilee Asid wrote:



Just to ask one simple question: how do I test for the existence of
multicollinearity after using the -ivreg2- command?



--
______________________________________________________________________

Marcello Pagano
Biostatistics Department
Harvard School of Public Health
655 Huntington Avenue
Boston, MA 02115, USA
Tel: 1-617-432-4911
Fax: 1-617-432-5619
email: [email protected]
http://biosun1.harvard.edu/~bio200

eppur si muove





