



Re: st: multiple regression, r squared and normality of residuals


From   Nick Cox <[email protected]>
To   [email protected]
Subject   Re: st: multiple regression, r squared and normality of residuals
Date   Wed, 23 Mar 2011 22:06:53 +0000

-lnskew0- was presumably intended.
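
For concreteness, a minimal Stata sketch of the suggestions in this thread (the variable names y and x1 are placeholders, not from the original posts):

    * -y- and -x1- are hypothetical stand-ins for the poster's variables
    generate y_log1p = ln(1 + y)   // keeps the zero observations that ln(y) drops
    ladder y                       // ladder of powers: compare candidate transforms
    lnskew0 y_zs = y               // zero-skewness log transform (note the 0)
    regress y x1
    avplot x1                      // added-variable plot, after -regress-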


On Wed, Mar 23, 2011 at 8:56 PM, David Greenberg <[email protected]> wrote:
> Keith, I tried looking for the lnskew command and couldn't find it. Could you indicate where it is located? Thank you, David Greenberg, Sociology Department, New York University
>
> ----- Original Message -----
> From: Keith Dear <[email protected]>
> Date: Tuesday, March 22, 2011 11:48 pm
> Subject: Re: st: multiple regression, r squared and normality of residuals
> To: [email protected]
>
>
>> You could also try a sqrt transform, since the log seems to have
>> overcooked it: see -ladder-. And you appear to have seven zeros (or
>> negatives) which is why you are losing N: a common solution is to use
>> log(1+x); or try -lnskew-
>>
>> However since your original residuals appeared normal, why are you
>> transforming the dependent variable at all? You probably have
>> something other than a straight-line dependence on one or more
>> covariates, so should be concentrating on the RHS of the equation not
>> the left. -fracpoly- and -mfp- may help, or or just lots of
>> scatterplots. Also see -help regress postestimation- for -avplot- and
>> others.
>>
*
*   For searches and help try:
*   http://www.stata.com/help.cgi?search
*   http://www.stata.com/support/statalist/faq
*   http://www.ats.ucla.edu/stat/stata/

