
Notice: On April 23, 2014, Statalist moved from an email list to a forum, based at statalist.org.



Re: st: radical change in t-stat, sign and significance


From   Nick Cox <[email protected]>
To   [email protected]
Subject   Re: st: radical change in t-stat, sign and significance
Date   Mon, 4 Apr 2011 09:21:32 +0100

Here is a simple example with real data:

. sysuse auto, clear
(1978 Automobile Data)

. regress price weight

      Source |       SS       df       MS              Number of obs =      74
-------------+------------------------------           F(  1,    72) =   29.42
       Model |   184233937     1   184233937           Prob > F      =  0.0000
    Residual |   450831459    72  6261548.04           R-squared     =  0.2901
-------------+------------------------------           Adj R-squared =  0.2802
       Total |   635065396    73  8699525.97           Root MSE      =  2502.3

------------------------------------------------------------------------------
       price |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
      weight |   2.044063   .3768341     5.42   0.000     1.292857    2.795268
       _cons |  -6.707353    1174.43    -0.01   0.995     -2347.89    2334.475
------------------------------------------------------------------------------

. gen weightsq = weight^2

. regress price weight*

      Source |       SS       df       MS              Number of obs =      74
-------------+------------------------------           F(  2,    71) =   23.09
       Model |   250285462     2   125142731           Prob > F      =  0.0000
    Residual |   384779934    71  5419435.69           R-squared     =  0.3941
-------------+------------------------------           Adj R-squared =  0.3770
       Total |   635065396    73  8699525.97           Root MSE      =    2328

------------------------------------------------------------------------------
       price |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
      weight |  -7.273097   2.691747    -2.70   0.009    -12.64029   -1.905906
    weightsq |   .0015142   .0004337     3.49   0.001     .0006494     .002379
       _cons |    13418.8   3997.822     3.36   0.001     5447.372    21390.23
------------------------------------------------------------------------------

You need to plot the data to see what is going on.

. twoway lfit price weight || qfit price weight || scatter price weight, ///
      legend(order(1 "linear" 2 "quadratic") pos(11) ring(0) col(1)) ///
      ytitle("Price (USD)")

Those who live by R-squared, P-values and t-statistics would
probably be quite happy with the quadratic model, but it is still a
mediocre model for these data and suggests structure -- a turning
point within the range of the data -- that is implausible. That is not
the fault of the quadratic, as such behaviour is its nature, but it is
a poor choice nevertheless.
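The same sign-flip mechanism can be sketched outside Stata. The following is a minimal illustration in Python with numpy (synthetic data, not the auto dataset above): a convex relationship whose minimum lies inside the observed range yields a positive slope from a straight-line fit, but a negative linear term once the squared term is added -- just as with weight and price.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic convex (U-shaped) data with the minimum at x = 2, inside
# the observed range [1, 5], so the overall linear trend is positive.
x = rng.uniform(1.0, 5.0, 200)
y = 2.0 * (x - 2.0) ** 2 + rng.normal(0.0, 0.5, 200)

# Linear fit: y ~ b0 + b1*x
X1 = np.column_stack([np.ones_like(x), x])
b_lin, *_ = np.linalg.lstsq(X1, y, rcond=None)

# Quadratic fit: y ~ b0 + b1*x + b2*x^2
X2 = np.column_stack([np.ones_like(x), x, x ** 2])
b_quad, *_ = np.linalg.lstsq(X2, y, rcond=None)

# The linear slope is positive on its own, but turns negative once
# the squared term (itself positive) enters the model.
print("linear fit slope:      ", b_lin[1])
print("quadratic linear term: ", b_quad[1])
print("quadratic squared term:", b_quad[2])
```

Neither coefficient in the quadratic fit is interpretable on its own; together they describe one curve, which is why plotting the fit is essential.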

Nick

On Fri, Apr 1, 2011 at 7:21 PM, Joerg Luedicke <[email protected]> wrote:
> On Fri, Apr 1, 2011 at 1:59 PM, Fabio Zona <[email protected]> wrote:
>> Dear all,
>>
>> I have a regression (zero inflated negative binomial): when I include the linear predictor alone (without its square term), the coefficient of this linear predictor is negative and significant.
>> However, when I introduce the square term of the same predictor: a) the linear one changes its sign, becomes positive, and is still significant; b) the square term gets a negative sign and is significant.
>>
>> Is this radical change in sign and significance of the linear coefficient a signal of some problems in the model?
>>
>
> Hi,
>
> You cannot interpret that as a "change in sign of the linear
> coefficient". Once you include the squared term, you can't interpret the
> two coefficients in isolation; they only make sense together. In your
> case, you found an inverted-U-shaped relation between your
> covariate and your dependent variable: your count goes up over the
> lower part of the covariate's range but then goes down. It is usually
> best to plot the effect to get a better sense of what it actually
> looks like. But what you found is not contradictory in any way.
>
*
*   For searches and help try:
*   http://www.stata.com/help.cgi?search
*   http://www.stata.com/support/statalist/faq
*   http://www.ats.ucla.edu/stat/stata/


© Copyright 1996–2018 StataCorp LLC