
Re: st: about residuals and coefficients


From   Daljit Dhadwal <[email protected]>
To   [email protected]
Subject   Re: st: about residuals and coefficients
Date   Tue, 3 Sep 2013 19:32:04 -0700

Another Stata command and article that may help you understand and
explain the coefficients in a multiple regression equation is
"Regression anatomy, revealed" by Valerio Filoso, in the Stata
Journal (Volume 13, Number 1, pp. 92-106). If you don't have access
to the Stata Journal, there's an older version of the article here:
http://works.bepress.com/cgi/viewcontent.cgi?article=1010&context=valerio_filoso

Information on the command is available through: ssc des reganat
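
For example, a minimal sketch (y, x1, and x2 are placeholder variable
names; see the help file for the full syntax):

. ssc install reganat
. reganat y x1 x2

This plots y against each predictor after partialling the other
predictors out of it, showing that each multiple-regression
coefficient equals the slope from a simple regression of y on that
residualized predictor.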

Daljit Dhadwal

On Tue, Sep 3, 2013 at 8:47 AM, John Antonakis <[email protected]> wrote:
> Right....dominance analysis will do the trick.
>
> See also -shapley- (by Stas Kolenikov), available from SSC:
>
> ssc des shapley
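>
> With only two predictors, you can see what a Shapley decomposition of
> R^2 does by computing it by hand (a sketch; y, x1, and x2 are
> placeholder names):
>
> . regress y x1
> . scalar r2_1 = e(r2)
> . regress y x2
> . scalar r2_2 = e(r2)
> . regress y x1 x2
> . scalar r2_12 = e(r2)
> . * each predictor's share is its marginal contribution to R^2,
> . * averaged over the two orders in which it can enter the model
> . display "x1's share: " (r2_1 + (r2_12 - r2_2))/2
> . display "x2's share: " (r2_2 + (r2_12 - r2_1))/2
>
> The two shares sum to the full-model R^2.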
>
> Best
> J.
>
> __________________________________________
>
> John Antonakis
> Professor of Organizational Behavior
> Director, Ph.D. Program in Management
>
> Faculty of Business and Economics
> University of Lausanne
> Internef #618
> CH-1015 Lausanne-Dorigny
> Switzerland
> Tel ++41 (0)21 692-3438
> Fax ++41 (0)21 692-3305
> http://www.hec.unil.ch/people/jantonakis
>
> Associate Editor:
> The Leadership Quarterly
> Organizational Research Methods
> __________________________________________
>
>
> On 03.09.2013 16:08, Joseph Luchman wrote:
>> Hi Kayla,
>>
>>   I might also mention that your interest seems to be moving toward
>> evaluating the relative importance of the predictors in terms of how
>> they reduce prediction error, which gets into how much of an overall
>> metric such as the R^2 can be ascribed to each predictor.
>>
>>   As David mentioned there's no way to separate how much of the R^2 is
>> ascribed solely to one variable or the other unless they're orthogonal
>> - but relative importance methods do something similar to that and can
>> be interpreted along those lines.  One such method is available
>> through the -domin- (SSC) program I wrote, in which the uncertainty
>> in ascribing R^2 to a predictor is resolved by averaging (giving
>> each predictor a portion).
>>
>>   There are some other metrics available in that module too; take a
>> look at -domin-'s help file, as it may be of use for what you're
>> trying to do.
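>>
>>   For instance, a basic run might look like this (a sketch with
>> placeholder variable names; check -domin-'s help file for the exact
>> current syntax):
>>
>> . ssc install domin
>> . domin y x1 x2, reg(regress) fitstat(e(r2))
>>
>> The output includes each predictor's averaged share of the R^2; the
>> shares sum to the full-model R^2.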
>>
>> - joe
>>
>> Joseph Nicholas Luchman, M.A.
>> ----
>> Behavioral Statistics Lead | Fors Marsh Group
>> Email: [email protected]
>> forsmarshgroup.com
>> ----
>> Doctoral Candidate
>> Industrial Organizational Psychology
>> George Mason University
>> https://www.researchgate.net/profile/Joseph_Luchman/
>>
>> On Mon, Sep 2, 2013 at 9:18 AM, Robson Glasscock <[email protected]>
>> wrote:
>>
>>> A log-level model specification allows one to directly interpret
>>> the percentage change in y per unit change in xi.
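>>>
>>> For example (a sketch; lny, y, x1, and x2 are placeholder names):
>>>
>>> . generate lny = ln(y)
>>> . regress lny x1 x2
>>>
>>> Here 100*_b[x1] approximates the percent change in y for a one-unit
>>> increase in x1; the exact figure is 100*(exp(_b[x1]) - 1).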
>>
>> On Mon, Sep 2, 2013 at 8:25 AM, David Hoaglin <[email protected]> wrote:
>>> Hi, Kayla.
>>>
>>> Your questions seem to be fairly basic ones about multiple regression.
>>>
>>> I hope you have looked at the three scatterplots (y vs. x1, y vs. x2,
>>> and x2 vs. x1) to see how the data behave.
>>>
>>> R^2 provides information equivalent to
>>> [sum(residual^2)]/[sum((y-ybar)^2)], often abbreviated as SSE/SST.
>>> R^2 = 1 - (SSE/SST) is the proportion of the (squared) variation in
>>> y that is accounted for by the regression model (i.e., by x1 and x2
>>> together).
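>>>
>>> After -regress-, you can check this from the stored results (a
>>> minimal sketch; e(rss) and e(mss) are the residual and model sums
>>> of squares, so SST = e(mss) + e(rss)):
>>>
>>> . regress y x1 x2
>>> . display "SSE/SST = " e(rss)/(e(mss) + e(rss))
>>> . display "R^2 = " 1 - e(rss)/(e(mss) + e(rss))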
>>>
>>> In general, it is not possible to express R^2 as the sum of a
>>> percentage accounted for by x1 and a percentage accounted for by x2.
>>> The obstacle is correlation (in the data) between x1 and x2.  You
>>> can, however, say how much variation x2 accounts for after
>>> adjustment for x1, and how much variation x1 accounts for after
>>> adjustment for x2.  To get those percentages, you can fit the simple
>>> regressions involving only x1 and only x2 and subtract the values of
>>> R^2 for those regressions from the value of R^2 for the regression
>>> involving both x1 and x2.  If x1 and x2 are uncorrelated (technically,
>>> orthogonal), usually by design, it is possible to express the R^2 of
>>> the two-variable model as the sum of the contributions of x1 and x2.
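>>>
>>> In Stata, those adjusted contributions take only a few commands (a
>>> sketch; y, x1, and x2 are placeholder names):
>>>
>>> . regress y x1 x2
>>> . scalar r2_full = e(r2)
>>> . regress y x1
>>> . display "added by x2, after x1: " r2_full - e(r2)
>>> . regress y x2
>>> . display "added by x1, after x2: " r2_full - e(r2)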
>>>
>>> I hope this discussion helps.
>>>
>>> David Hoaglin
>>>
>>> On Mon, Sep 2, 2013 at 5:57 AM, Kayla Bridge <[email protected]>
>>> wrote:
>>>> Dear all,
>>>> I am currently running a simple regression and trying to explain
>>>> the coefficients. The model and estimation results are the following:
>>>> y = 5.41 + 1.24*x1 + 0.28*x2,  R^2 = 0.7,  N = 20
>>>>     (0.58)   (3.4)     (2.56)
>>>> The t-statistics are in parentheses.
>>>> I'd like to know how much (in terms of percentage) of the change in
>>>> y is accounted for by a change in x1, and how much by a change in x2.
>>>> Another question: can I use [sum(residual^2)]/[sum((y-ybar)^2)],
>>>> where ybar is the mean value of the dependent variable, to say
>>>> something about the percentage of residual variation, i.e., does a
>>>> smaller percentage of residual variation imply that x1 and x2 are
>>>> good explanatory factors for y?
>>>> Any suggestion is greatly appreciated.
>>>> Best,
>>>> Kayla

*
*   For searches and help try:
*   http://www.stata.com/help.cgi?search
*   http://www.stata.com/support/faqs/resources/statalist-faq/
*   http://www.ats.ucla.edu/stat/stata/

