
RE: st: calculating effect sizes when using svy command


From   "Vogt, Dawne" <[email protected]>
To   "'[email protected]'" <[email protected]>
Subject   RE: st: calculating effect sizes when using svy command
Date   Tue, 7 Dec 2010 16:11:28 -0500

Thanks. So it sounds like I can take the square root of the R squared value to get the correlation coefficient for a regression with 1 predictor. But how do I get effect size indicators (preferably in the form of correlation coefficients) for each predictor in a regression with multiple predictors?
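
For the unweighted OLS case, the per-predictor calculation being described looks something like the following minimal Stata sketch (the auto dataset stands in for real data, and per Steve's caveat below this t-to-r relation does not carry over to -svy- estimation):

* unweighted OLS only; this t-to-r conversion does not apply to -svy- results
sysuse auto, clear
regress price mpg weight
local df = e(df_r)
local t  = _b[mpg]/_se[mpg]
* partial correlation for one predictor from its t statistic and residual df
display "partial r for mpg = " `t'/sqrt(`t'^2 + `df')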


-----Original Message-----
From: [email protected] [mailto:[email protected]] On Behalf Of Steven Samuels
Sent: Tuesday, December 07, 2010 4:00 PM
To: [email protected]
Subject: Re: st: calculating effect sizes when using svy command

I should have added: The relation of (partial) r-squares to t-statistics holds only for ordinary least squares, not for the estimation formulas of survey regression. So, neither of your calculated r's is correct.

Steve

On Dec 7, 2010, at 3:31 PM, Vogt, Dawne wrote:

I have two questions related to calculating effect sizes using svyreg (pweights):

First, when doing unweighted regressions in SPSS, I like to provide effect sizes for each predictor by calculating a correlation coefficient value (r) from the t values provided in the output. I like using r because it is easy for most people to interpret. Can I do the same using svyreg output?

My second question is related to the first. Since there is no correlation option under the svy commands, I have been computing regressions of Y on X and of X on Y and using the larger p value of the two sets of results, as recommended elsewhere. I'm having trouble figuring out how to convert the results provided in the output to a correlation coefficient, though. I noticed that the r value I get by taking the square root of the R squared is different from my own hand calculation of r derived from the t value provided in the regression output, r = sqrt(t^2 / (t^2 + df)). I'm not sure which r is correct (or whether either of them is correct).
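
For a one-predictor unweighted -regress- fit the two calculations should coincide; a minimal sketch on stand-in data (Stata's auto dataset standing in for real survey data) illustrates this. Under -svy- estimation the t statistic is design-based, so the two quantities need not agree there.

* unweighted OLS illustration only; auto data stands in for real data
sysuse auto, clear
regress price mpg
local t  = _b[mpg]/_se[mpg]
local df = e(df_r)
display "sqrt(R^2)          = " sqrt(e(r2))
display "sqrt(t^2/(t^2+df)) = " sqrt(`t'^2/(`t'^2 + `df'))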

Thanks in advance for any guidance others may be able to offer.


*
*   For searches and help try:
*   http://www.stata.com/help.cgi?search
*   http://www.stata.com/support/statalist/faq
*   http://www.ats.ucla.edu/stat/stata/
