
Re: st: weighted least squares with time dummy variables


From   "Clive Nicholas" <[email protected]>
To   [email protected]
Subject   Re: st: weighted least squares with time dummy variables
Date   Wed, 24 Jan 2007 05:51:19 -0000 (GMT)

Do Han Kim wrote:

> I'm trying to run weighted least square with time dummy variables.

[...]

> My question is that, when I change the reference group, the regression
> coefficients also change.
> Could you please explain why?
> From my understanding, changing reference group should not influence on
> regression coefficients.

Not quite sure how or where you picked up this misunderstanding:

. webuse grunfeld, clear

. tab time, gen(t)

       time |      Freq.     Percent        Cum.
------------+-----------------------------------
          1 |         10        5.00        5.00
          2 |         10        5.00       10.00
          3 |         10        5.00       15.00
          4 |         10        5.00       20.00
          5 |         10        5.00       25.00
          6 |         10        5.00       30.00
          7 |         10        5.00       35.00
          8 |         10        5.00       40.00
          9 |         10        5.00       45.00
         10 |         10        5.00       50.00
         11 |         10        5.00       55.00
         12 |         10        5.00       60.00
         13 |         10        5.00       65.00
         14 |         10        5.00       70.00
         15 |         10        5.00       75.00
         16 |         10        5.00       80.00
         17 |         10        5.00       85.00
         18 |         10        5.00       90.00
         19 |         10        5.00       95.00
         20 |         10        5.00      100.00
------------+-----------------------------------
      Total |        200      100.00

Notice that there are equal numbers of observations for each time period
in these data.

. g weight=(1/invnorm(uniform()))^2

. reg invest t2-t20 [pw=weight]
(sum of wgt is   1.0540e+04)

Linear regression                                    Number of obs =     200
                                                     F( 19,   180) =  437.53
                                                     Prob > F      =  0.0000
                                                     R-squared     =  0.6361
                                                     Root MSE      =  82.401

----------------------------------------------------------------------------
           |               Robust
    invest |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
-----------+----------------------------------------------------------------
        t2 |   160.1355   97.52668     1.64   0.102    -32.30721    352.5781
        t3 |  -50.43065   53.12961    -0.95   0.344    -155.2676    54.40634
        t4 |  -91.05831   52.99991    -1.72   0.088    -195.6394    13.52275
        t5 |   23.48066   101.7299     0.23   0.818    -177.2559    224.2173
        t6 |  -91.37126   53.00416    -1.72   0.086    -195.9607    13.21819
        t7 |   30.59899   102.2527     0.30   0.765    -171.1692    232.3672
        t8 |  -65.49078   56.19894    -1.17   0.245    -176.3843     45.4027
        t9 |  -61.20838   54.64624    -1.12   0.264     -169.038    46.62126
       t10 |    131.403   69.50193     1.89   0.060    -5.740389    268.5463
       t11 |  -92.06036    58.5703    -1.57   0.118    -207.6331    23.51236
       t12 |   151.1909   134.5932     1.12   0.263    -114.3926    416.7745
       t13 |   365.5336    96.6863     3.78   0.000     174.7492     556.318
       t14 |  -10.58412   63.97232    -0.17   0.869    -136.8163     115.648
       t15 |  -67.78393   58.72797    -1.15   0.250    -183.6678    48.09991
       t16 |  -69.49276   55.74378    -1.25   0.214    -179.4881    40.50258
       t17 |   19.29812   53.70033     0.36   0.720    -86.66503    125.2613
       t18 |   26.87425   53.01444     0.51   0.613    -77.73547     131.484
       t19 |   216.3049    209.707     1.03   0.304    -197.4954    630.1052
       t20 |   104.5794   108.5028     0.96   0.336    -109.5217    318.6805
     _cons |   118.8594   52.99888     2.24   0.026     14.28041    223.4384
----------------------------------------------------------------------------

. reg invest t1-t19 [pw=weight]
(sum of wgt is   1.0540e+04)

Linear regression                                    Number of obs =     200
                                                     F( 19,   180) =  437.53
                                                     Prob > F      =  0.0000
                                                     R-squared     =  0.6361
                                                     Root MSE      =  82.401

----------------------------------------------------------------------------
           |               Robust
    invest |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
-----------+----------------------------------------------------------------
        t1 |  -104.5794   108.5028    -0.96   0.336    -318.6805    109.5217
        t2 |   55.55602   125.1661     0.44   0.658    -191.4256    302.5376
        t3 |  -155.0101   94.75154    -1.64   0.104    -341.9767    31.95657
        t4 |  -195.6377   94.67887    -2.07   0.040     -382.461   -8.814478
        t5 |  -81.09877   128.4682    -0.63   0.529    -334.5961    172.3986
        t6 |  -195.9507   94.68125    -2.07   0.040    -382.7787   -9.122731
        t7 |  -73.98044   128.8825    -0.57   0.567    -328.2955    180.3346
        t8 |  -170.0702   96.50605    -1.76   0.080    -360.4989    20.35851
        t9 |  -165.7878   95.61019    -1.73   0.085    -354.4488    22.87317
       t10 |   26.82353   104.8123     0.26   0.798    -179.9953    233.6424
       t11 |  -196.6398   97.90596    -2.01   0.046    -389.8309   -3.448725
       t12 |   46.61149   155.7897     0.30   0.765    -260.7976    354.0206
       t13 |   260.9541   124.5124     2.10   0.037     15.26244    506.6459
       t14 |  -115.1636   101.2302    -1.14   0.257    -314.9141    84.58702
       t15 |  -172.3634   98.00037    -1.76   0.080    -365.7407    21.01399
       t16 |  -174.0722   96.24171    -1.81   0.072    -363.9793    15.83491
       t17 |  -85.28131   95.07273    -0.90   0.371    -272.8817    102.3191
       t18 |  -77.70518     94.687    -0.82   0.413    -264.5445    109.1341
       t19 |   111.7254    223.902     0.50   0.618    -330.0849    553.5358
     _cons |   223.4389   94.67829     2.36   0.019     36.61674     410.261
----------------------------------------------------------------------------

OK, I know this is weighted OLS (WOLS) rather than WLS proper, but you get
the drift: it is perfectly normal for the parameter estimates to change
when you change which time dummy is omitted. Each dummy's coefficient is
the difference between its period and the omitted reference period, so
changing the reference changes every contrast, even though the underlying
model is the same. Notice that the model diagnostics (F, R-squared, Root
MSE) are exactly the same in both runs.
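
You can see the equivalence directly in the two outputs: in the first
model the fitted (weighted) mean of invest for period 20 is
_cons + t20 = 118.8594 + 104.5794 = 223.44, which is exactly the _cons of
the second model, where period 20 is the reference. If you want Stata to
do the arithmetic, a quick -lincom- check on the same two models (a sketch
you can run yourself) makes the point:

. quietly reg invest t2-t20 [pw=weight]
. lincom _cons + t20

. quietly reg invest t1-t19 [pw=weight]
. lincom _cons

Both report 223.44 with the same standard error: same model, different
reference category.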

Whenever I've run a model with dummies, I've _never_ had to worry about
which one of them would make the most sense to drop, largely because:

(1) I know which one will be omitted beforehand;

and

(2) my problem is normally keeping them all _in_ my model! You should thank
    your lucky stars...
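
And if you're ever genuinely unsure which dummy will go, hand Stata all
twenty and let it decide: it drops one for collinearity and flags it with
a note, and the remaining coefficients are then measured relative to
whichever period it dropped. A throwaway sketch on the same data:

. reg invest t1-t20 [pw=weight]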

Anyway, if you want to perform WLS regressions, you may find Phil Ender's
-wls0- routine, downloadable from SSC, of use:

. wls0 invest t2-t20, wvar(weight) type(abse)

WLS regression -  type: proportional to abs(e)

(sum of wgt is   1.3769e+00)

      Source |       SS       df       MS            Number of obs =     200
-------------+------------------------------         F( 19,   180) =    0.69
       Model |  638334.902    19  33596.5738         Prob > F      =  0.8233
    Residual |  8730858.56   180  48504.7698         R-squared     =  0.0681
-------------+------------------------------         Adj R-squared = -0.0302
       Total |  9369193.46   199  47081.3742         Root MSE      =  220.24

----------------------------------------------------------------------------
    invest |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
-----------+----------------------------------------------------------------
        t2 |   28.91055   98.52813     0.29   0.770    -165.5082    223.3293
        t3 |   50.02784   98.53465     0.51   0.612    -144.4038    244.4595
        t4 |   5.039467   98.47842     0.05   0.959    -189.2812    199.3601
        t5 |   7.772377   98.39881     0.08   0.937    -186.3912    201.9359
        t6 |   40.05533   98.55082     0.41   0.685    -154.4082    234.5188
        t7 |   66.91102   98.50374     0.68   0.498    -127.4596    261.2816
        t8 |   49.90693   98.47082     0.51   0.613    -144.3987    244.2126
        t9 |   45.60642    98.4746     0.46   0.644    -148.7067    239.9195
       t10 |   48.16045   98.39934     0.49   0.625    -146.0042    242.3251
       t11 |   51.50525   98.55601     0.52   0.602    -142.9685     245.979
       t12 |   88.94119   98.54166     0.90   0.368    -105.5042    283.3866
       t13 |   73.73731   98.47266     0.75   0.455     -120.572    268.0466
       t14 |   81.49318   98.57705     0.83   0.410    -113.0221    276.0084
       t15 |   68.16292   98.41564     0.69   0.489    -126.0338    262.3597
       t16 |   76.87898    97.9299     0.79   0.433    -116.3593    270.1173
       t17 |    127.194   98.57688     1.29   0.199    -67.32097    321.7089
       t18 |   152.1969   98.51953     1.54   0.124    -42.20491    346.5987
       t19 |   207.6927   99.13833     2.09   0.038     12.06987    403.3155
       t20 |   201.0773   98.48557     2.04   0.043     6.742499     395.412
     _cons |   72.65918   69.65713     1.04   0.298    -64.79042    210.1088
----------------------------------------------------------------------------

The above output is garbage (note the negative adjusted R-squared), which
is no surprise given my made-up weights: yours shouldn't be! Alternatively,
there's also -xtreg, be wls-. Whether either solution fits your data is a
question only you can answer.
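
In case the -xtreg- route is unfamiliar, here is a minimal syntax sketch
on the same data. Time dummies are no use in a between regression
(averaged within company they are constant across panels), so the
dataset's other regressors stand in purely to show the form of the
command:

. webuse grunfeld, clear
. tsset company year
. xtreg invest mvalue kstock, be wls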

CLIVE NICHOLAS        |t: 0(044)7903 397793
Politics              |e: [email protected]
Newcastle University  |http://www.ncl.ac.uk/geps

Wherever you go and whatever you do, just remember this. No matter how
many like you, admire you, love you or adore you, the number of people
turning up to your funeral will be largely determined by local weather
conditions.

*
*   For searches and help try:
*   http://www.stata.com/support/faqs/res/findit.html
*   http://www.stata.com/support/statalist/faq
*   http://www.ats.ucla.edu/stat/stata/


