Statalist The Stata Listserver



RE: st: Fixed Effect Estimation Results


From   "Schaffer, Mark E" <[email protected]>
To   <[email protected]>
Subject   RE: st: Fixed Effect Estimation Results
Date   Tue, 4 Apr 2006 18:54:18 +0100

Joana, 

> -----Original Message-----
> From: [email protected] 
> [mailto:[email protected]] On Behalf Of Joana Quina
> Sent: 04 April 2006 17:14
> To: [email protected]
> Subject: Re: st: Fixed Effect Estimation Results
> 
> Dear William,
> 
> Firstly, I apologise for not having reported my results.  The 
> ones you quote are the ones obtained by Sam, who initiated 
> this thread.
> 
> My results are shown below.  The corr is very high (-0.9899) 
> and the within R2 is 0.4587.  There are *no* extra variables 
> in the random-effects estimation.  A similar model 
> specification has been used in the literature.
> 
> Also, I found that it is the "lpop" variable that is driving 
> the high correlation result - omitting it surprisingly 
> *reduces* the correlation.
> 
> I have re-run my regressions several times and always obtain 
> the same results. I regularly check updates (a while ago 
> there was a problem with xtreg, which was corrected).

I wonder if this is being driven by the "omnibus" nature of your Hausman
test.  In your application it has 12 degrees of freedom, one for each
coefficient.  I can imagine that, loosely speaking, if one coefficient is
"significantly different" between the two specifications and the other 11
are "very similar", the omnibus test could still fail to reject the null
that both specifications are consistent.
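
As a rough check on that intuition, here is a small sketch using Stata's
built-in chi-squared functions (the 9.04 is the omnibus chi2(12) statistic
reported by -hausman- in your output below):

* 5% critical value of a chi-squared test with 12 degrees of freedom
display invchi2tail(12, .05)

* upper-tail p-value of the omnibus statistic of 9.04
display chi2tail(12, 9.04)

The first is about 21.0 and the second about 0.70 (matching the Prob>chi2
in your output), so one sizeable coefficient difference is easily swamped
by eleven near-zero ones.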

Just eyeballing your results suggests that the lpop variable could
indeed be behind this.  The difference in coefficients for this variable
is large relative to its entry in the "S.E." column, which is not the
case for the other 11.

It is possible to do Hausman tests for subsets of coefficients.
Wooldridge's 2002 textbook, p. 290, discusses how to do this.  For one
coefficient, it's pretty easy:

H = (b_fe - b_re) / sqrt { [SE(b_fe)]^2 - [SE(b_re)]^2 }

For a single coefficient you can compute it straight from your -hausman-
output: the numerator is given in the "Difference" column (-4.231998) and
the denominator in the "S.E." column (2.216321).  The Hausman statistic
for lpop alone is therefore H = -4.231998/2.216321 = -1.91, to be compared
against a standard normal; the two-sided p-value is about 0.056, which is
bordering on significant.
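
If you would rather let Stata do the arithmetic, here is a minimal sketch
(untested, so treat it as illustrative only), assuming your estimates are
still stored as fixed2 and random2 as in your log below:

* restore the stored fixed-effects results and grab the lpop coef. and s.e.
estimates restore fixed2
scalar b_fe  = _b[lpop]
scalar se_fe = _se[lpop]

* same for the stored random-effects results
estimates restore random2
scalar b_re  = _b[lpop]
scalar se_re = _se[lpop]

* one-coefficient Hausman statistic and its two-sided normal p-value
scalar H = (b_fe - b_re) / sqrt(se_fe^2 - se_re^2)
display "Hausman statistic for lpop = " H
display "two-sided p-value          = " 2*normal(-abs(H))

This should reproduce the -1.91 above.  Note that se_fe^2 - se_re^2 can in
principle come out negative (the difference of the two variance matrices
need not be positive definite in finite samples), in which case the
statistic is undefined.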

Hope this helps.

Cheers,
Mark

> Joana
> 
> . xtreg lbeda_pc lpop lgdp_pc_ppp elrsacw polity_n pts_s_n corrupt milm_j us_un_friend japan_un_friend uk_un_friend france_un_friend indep, re
>
> Random-effects GLS regression                   Number of obs      =        96
> Group variable (i): id                          Number of groups   =        27
>
> R-sq:  within  = 0.4093                         Obs per group: min =         2
>        between = 0.6133                                        avg =       3.6
>        overall = 0.5911                                        max =         4
>
> Random effects u_i ~ Gaussian                   Wald chi2(12)      =     73.52
> corr(u_i, X)       = 0 (assumed)                Prob > chi2        =    0.0000
>
> ------------------------------------------------------------------------------
>     lbeda_pc |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
> -------------+----------------------------------------------------------------
>         lpop |  -.6168131   .1338557    -4.61   0.000    -.8791654   -.3544608
>  lgdp_pc_ppp |  -.2690165   .1925044    -1.40   0.162    -.6463181    .1082851
>      elrsacw |   .1004888   .1544715     0.65   0.515    -.2022697    .4032473
>     polity_n |   .0293785   .0180981     1.62   0.105    -.0060931    .0648501
>      pts_s_n |   .0405079   .0644439     0.63   0.530    -.0857998    .1668155
>      corrupt |  -.1543259   .0422096    -3.66   0.000    -.2370552   -.0715967
>       milm_j |  -.0010754   .0014974    -0.72   0.473    -.0040103    .0018594
> us_un_friend |  -.0329801   .0127515    -2.59   0.010    -.0579725   -.0079877
> japan_un_f~d |   .0455285   .0226189     2.01   0.044     .0011962    .0898608
> uk_un_friend |   .0178181   .0290755     0.61   0.540    -.0391689    .0748052
> france_un_~d |   .0135801   .0212409     0.64   0.523    -.0280514    .0552116
>        indep |  -.0052071    .009476    -0.55   0.583    -.0237797    .0133656
>        _cons |   1.744271   2.436455     0.72   0.474    -3.031092    6.519634
> -------------+----------------------------------------------------------------
>      sigma_u |  .68013164
>      sigma_e |  .29411034
>          rho |  .84246212   (fraction of variance due to u_i)
> ------------------------------------------------------------------------------
> 
> . est store random2
> 
> . xtreg lbeda_pc lpop lgdp_pc_ppp elrsacw polity_n pts_s_n corrupt milm_j us_un_friend japan_un_friend uk_un_friend france_un_friend indep, fe
>
> Fixed-effects (within) regression               Number of obs      =        96
> Group variable (i): id                          Number of groups   =        27
>
> R-sq:  within  = 0.4587                         Obs per group: min =         2
>        between = 0.5312                                        avg =       3.6
>        overall = 0.5104                                        max =         4
>
>                                                 F(12,57)           =      4.03
> corr(u_i, Xb)  = -0.9899                        Prob > F           =    0.0002
>
> ------------------------------------------------------------------------------
>     lbeda_pc |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
> -------------+----------------------------------------------------------------
>         lpop |  -4.848811    2.22036    -2.18   0.033    -9.295005   -.4026177
>  lgdp_pc_ppp |  -.2184703   .3257575    -0.67   0.505    -.8707884    .4338478
>      elrsacw |   .3184507   .1787963     1.78   0.080    -.0395826    .6764841
>     polity_n |   .0598919   .0217173     2.76   0.008     .0164037    .1033801
>      pts_s_n |   .1013231    .074363     1.36   0.178    -.0475863    .2502325
>      corrupt |  -.1579303   .0489601    -3.23   0.002    -.2559712   -.0598893
>       milm_j |  -.0005835   .0016394    -0.36   0.723    -.0038663    .0026994
> us_un_friend |  -.0261931   .0134911    -1.94   0.057    -.0532086    .0008224
> japan_un_f~d |   .0407745   .0305656     1.33   0.188    -.0204322    .1019811
> uk_un_friend |   .0328692   .0323695     1.02   0.314    -.0319497    .0976881
> france_un_~d |  -.0031448   .0282499    -0.11   0.912    -.0597143    .0534247
>        indep |   .1052638   .0703668     1.50   0.140    -.0356432    .2461709
>        _cons |   6.981901   4.983844     1.40   0.167    -2.998075    16.96188
> -------------+----------------------------------------------------------------
>      sigma_u |  4.8302321
>      sigma_e |  .29411034
>          rho |  .99630617   (fraction of variance due to u_i)
> ------------------------------------------------------------------------------
> F test that all u_i=0:     F(26, 57) =    13.71              Prob > F = 0.0000
> 
> . est store fixed2
> 
> . hausman fixed2 random2
> 
>                  ---- Coefficients ----
>              |      (b)          (B)            (b-B)     sqrt(diag(V_b-V_B))
>              |     fixed2      random2       Difference          S.E.
> -------------+------------------------------------------------------------
>         lpop |   -4.848811    -.6168131       -4.231998        2.216321
>  lgdp_pc_ppp |   -.2184703    -.2690165        .0505462        .2627927
>      elrsacw |    .3184507     .1004888        .2179619        .0900371
>     polity_n |    .0598919     .0293785        .0305134        .0120043
>      pts_s_n |    .1013231     .0405079        .0608153        .0371059
>      corrupt |   -.1579303    -.1543259       -.0036043        .0248082
>       milm_j |   -.0005835    -.0010754         .000492        .0006674
> us_un_friend |   -.0261931    -.0329801         .006787        .0044057
> japan_un_f~d |    .0407745     .0455285        -.004754        .0205583
> uk_un_friend |    .0328692     .0178181        .0150511        .0142267
> france_un_~d |   -.0031448     .0135801       -.0167249        .0186247
>        indep |    .1052638    -.0052071        .1104709        .0697258
> ------------------------------------------------------------------------------
>                            b = consistent under Ho and Ha; obtained from xtreg
>             B = inconsistent under Ha, efficient under Ho; obtained from xtreg
> 
>     Test:  Ho:  difference in coefficients not systematic
> 
>                  chi2(12) = (b-B)'[(V_b-V_B)^(-1)](b-B)
>                           =        9.04
>                 Prob>chi2 =      0.6999
> 
> On 04/04/06, William Gould, Stata <[email protected]> wrote:
> > Joana Quina <[email protected]> reports,
> >
> >    1.  She has estimated the parameters of a model using -xtreg, fe-
> >        and that the reported correlation between u_i and X_ij*b is
> >        .9249.
> >
> >    2.  She has estimated the parameters of the same model on the same
> >        data.  She then performs a Hausman test that fails to reject
> >        random effects.
> >
> > She writes,
> >
> > > It seems counter-intuitive. Any suggestions would be much
> > > appreciated.
> >
> > It certainly does seem counterintuitive.  My first reaction is to 
> > suggest Joana check her work.
> >
> > Let's first understand just how counterintuitive this is.  The
> > correlation between u_i and X_ij*b is .9249.  Now let's use an
> > estimation method that constrains that correlation to be 0.  X_ij is
> > fixed, so the only thing that can give is b.  The estimated b has got
> > to change.  The Hausman test bases its calculation on the change in
> > b, and it reports that the change is small, relative to variance.
> >
> > What that could mean is that the variance is large, so large as to
> > suggest that the model, estimated either way, is not worth much.  But
> > Joana showed us (1) and the within R^2 was .6385, so let's dismiss
> > that.
> >
> > However, X_ij is *NOT* necessarily fixed.  Joana could have included
> > extra variables in the random-effects estimation, variables whose
> > coefficients could not be estimated by the fixed-effects estimation.
> > In that case, the result is not counterintuitive at all.  Omit those
> > variables, as done in the fixed-effects estimation, and u_i is
> > correlated.  Include them, and the correlation vanishes.  Said
> > differently, the subset of the b's estimated by both estimators did
> > not change, and the extra b's estimated by the random-effects
> > estimator eliminated the correlation.  This is exactly what one hopes
> > will happen if one has a well-specified model.
> >
> > Is that what happened?
> >
> > -- Bill
> > [email protected]

*
*   For searches and help try:
*   http://www.stata.com/support/faqs/res/findit.html
*   http://www.stata.com/support/statalist/faq
*   http://www.ats.ucla.edu/stat/stata/


