Notice: On April 23, 2014, Statalist moved from an email list to a forum, based at statalist.org.


From: David Hoaglin <dchoaglin@gmail.com>
To: statalist@hsphsun2.harvard.edu
Subject: Re: st: Statistical tests under heteroskedasticity
Date: Mon, 14 May 2012 11:08:13 -0400

Dave,

What is the part about normally distributed coefficients? In the usual linear regression models, the coefficients are assumed to be constants.

If the departure from constant variance (of the errors) is systematic, it may point to a need for a transformation of the dependent variable.

David Hoaglin

On Mon, May 14, 2012 at 10:34 AM, Airey, David C <david.airey@vanderbilt.edu> wrote:
> .
>
> I was just reviewing assumptions for linear regression (simple error structure).
>
> independence -- needed for all types of inference
> normally distributed coefficients -- needed for all types of inference
> constant variance -- needed for all types of inference; can be relaxed by robust standard errors
> correct mean model -- needed for group comparison and prediction
> normal residuals -- needed for prediction; not fixed by robust standard errors
>
> So we still want the correct mean model before doing tests between group means using robust standard errors. We may not have the correct mean model if important covariates are missing.
>
> If we are not looking at group comparisons, but just at the first-order linear trend for a test of association (with a continuous or ordered X variable), then using robust standard errors does not require the correct mean model.
>
> Your hypothesis and what you want to conclude are important to communicate to the list.
>
> You can see that -test- uses the robust standard errors when they are requested in the regression, by running the same model with and without them:
>
> sysuse auto
> ttest weight, by(foreign) welch
> regress weight i.foreign, vce(robust)
> test 1.foreign // square the regression t to get the F
> regress weight i.foreign
> test 1.foreign // square the regression t to get the F
>
> I really like the -contrast- command in Stata 12 (an improved alternative to -test- in Stata 11).
>
> -Dave

*
* For searches and help try:
* http://www.stata.com/help.cgi?search
* http://www.stata.com/support/statalist/faq
* http://www.ats.ucla.edu/stat/stata/
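[A note on the Stata example above: for a regression on a single group dummy, the OLS slope is exactly the difference in group means, and a one-restriction Wald F is the squared t. The HC2 flavor of the robust sandwich estimator coincides exactly with the Welch (unpooled) standard error in this two-group case; Stata's default vce(robust) is HC1, which is close but not identical. The following pure-Python sketch illustrates the algebra with made-up numbers standing in for auto.dta's weight by foreign; the variable names and data are hypothetical, not from the thread.]

```python
import statistics as st
from math import sqrt

def welch_se(y0, y1):
    """Unpooled (Welch) standard error of the difference in means."""
    return sqrt(st.variance(y0) / len(y0) + st.variance(y1) / len(y1))

def hc2_se_dummy(y0, y1):
    """HC2 robust SE of the slope from OLS of y on a group dummy.

    With a single binary regressor the slope is the mean difference,
    each observation's leverage is 1/n_g within its group, and the
    HC2 sandwich reduces to a sum over groups of e_i^2/(1 - h_i)
    weighted by 1/n_g^2.
    """
    var = 0.0
    for y in (y0, y1):
        n, m = len(y), st.mean(y)
        h = 1 / n  # leverage of every observation in this group
        var += sum((yi - m) ** 2 / (1 - h) for yi in y) / n ** 2
    return sqrt(var)

# Hypothetical weights standing in for auto.dta's weight by foreign
domestic = [3300, 3690, 2830, 3250, 4080, 3670]
foreign_ = [2070, 2650, 1990, 2160, 2370]

diff = st.mean(foreign_) - st.mean(domestic)  # the OLS slope on the dummy
t = diff / welch_se(domestic, foreign_)       # Welch t statistic
F = t ** 2  # a one-restriction Wald F is the squared t, as in -test-
```

Running this, welch_se and hc2_se_dummy agree to floating-point precision, which is why -ttest, welch- and a robust dummy regression tell essentially the same story in the two-group case.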

**References**:
- **re: st: Statistical tests under heteroskedasticity**
  *From:* "Airey, David C" <david.airey@vanderbilt.edu>
