
Re: st: calculating effect sizes when using svy command


From: Steven Samuels <[email protected]>
To: [email protected]
Subject: Re: st: calculating effect sizes when using svy command
Date: Tue, 7 Dec 2010 17:39:11 -0500

Well, I got the details wrong, but the example right.

If the multiple regression is:
svy: reg y x z

svy: reg y z, with residual r_yz
svy: reg x z, with residual r_xz

The correlation of r_yz and r_xz is the partial correlation r_yx.z. So the goal is to estimate the correlation of r_yz and r_xz with one of the methods in the post, including

svy: reg r_yz r_xz
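
For clarity, the full sequence spelled out as runnable commands (a minimal
sketch; y, x, and z are placeholder names, and the data are assumed to have
been -svyset- already):

svy: reg y z
predict r_yz, resid     // residual of y given z
svy: reg x z
predict r_xz, resid     // residual of x given z
svy: reg r_yz r_xz      // R-squared here is the squared partial correlation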

Sorry for the confusion.

Steve
[email protected]

On Dec 7, 2010, at 5:03 PM, Steven Samuels wrote:

--


What Dawne called "correlation" is actually the partial correlation of y and x, controlling for the other covariates z (r_yx.z). The squared partial correlation in OLS can be computed as t^2/(t^2 + df_error), but that relation does not hold for survey regression. The partial correlation can, however, be estimated in a three-step process: -svy: regress- y on x, with residual r_yx; then y on the z's, with residual r_yz. Then compute the correlation of r_yx and r_yz by means of 1) Nick's program, 2) the methods in Bill Sribney's FAQ, or 3) running -svy: reg r_yx r_yz- and taking the R-squared reported by that command. The p-value issue discussed by Bill doesn't arise for Dawne, because she takes the p-value from the original regression. See below for an example.


Steve

**************************CODE BEGINS**************************
sysuse auto, clear
svyset rep78 [pw=trunk]

/* Compute partial correlation of mpg and weight, controlling for turn */
svy: reg mpg weight turn

svy: reg mpg turn
predict r_mt, resid

svy: reg weight turn
predict r_wt, resid

svy: reg r_mt r_wt   // R-squared is the partial r-squared
***************************CODE ENDS***************************
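
If a signed correlation coefficient is wanted rather than the r-square, one
extra line (my addition, not part of the example above) recovers it from the
stored results of that last fit, taking the sign from the slope:

display sign(_b[r_wt])*sqrt(e(r2))   // signed partial correlation of mpg and weight, given turn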


On Dec 7, 2010, at 4:18 PM, Nick Winter wrote:

Re: svy and (bivariate) correlation, this FAQ talks about how to do the equivalent of the nonexistent -svy: correlate-:

http://www.stata.com/support/faqs/stat/survey.html

The short version is that the point estimate comes from -correlate- with aweights, and the p-value is obtained as you describe below.
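
As a minimal sketch of that recipe (the auto data and toy svyset below are
purely for illustration):

sysuse auto, clear
svyset rep78 [pw=trunk]            // illustrative design only
correlate mpg weight [aw=trunk]    // point estimate of the correlation
svy: reg mpg weight                // one p-value ...
svy: reg weight mpg                // ... and the other; report the larger of the two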

My -corr_svy- from SSC implements this approach, though it is a Stata version 7 program so it does not take advantage of Stata's current, more extensive -svy- features.

-Nick Winter


On 12/7/2010 4:11 PM, Vogt, Dawne wrote:
Thanks. So it sounds like I can take the square root of the R squared value to get the correlation coefficient for a regression with 1 predictor. But how do I get effect size indicators (preferably in the form of correlation coefficients) for each predictor in a regression with multiple predictors?


-----Original Message-----
From: [email protected] [mailto:[email protected] ] On Behalf Of Steven Samuels
Sent: Tuesday, December 07, 2010 4:00 PM
To: [email protected]
Subject: Re: st: calculating effect sizes when using svy command

--

I should have added: The relation of (partial) r-squares to t-statistics
holds only for ordinary least squares, not for the estimation formulas of
survey regression. So, neither of your calculated r's is correct.
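
A quick way to see this, sketched on the auto data with the same toy design
used in the worked example elsewhere in this thread (illustration only):

sysuse auto, clear

* under OLS, the one-predictor R-squared equals t^2/(t^2 + residual df)
regress mpg weight
local t = _b[weight]/_se[weight]
display `t'^2/(`t'^2 + e(df_r)) "  vs  " e(r2)

* after -svy- estimation the same calculation no longer matches the reported R-squared
svyset rep78 [pw=trunk]
svy: reg mpg weight
local t = _b[weight]/_se[weight]
display `t'^2/(`t'^2 + e(df_r)) "  vs  " e(r2)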

Steve

On Dec 7, 2010, at 3:31 PM, Vogt, Dawne wrote:

I have two questions related to calculating effect sizes using svyreg
(pweights):

First, when doing unweighted regressions in SPSS, I like to provide
effect sizes for each predictor by calculating a correlation
coefficient value (r) from the t values provided in the output. I like
using r because it is easy for most people to interpret. Can I do the
same using svyreg output?

My second question is related to the first. Since there is no
correlation option under the svy commands, I have been computing
regressions of Y on X and of X on Y and using the larger p-value of the
two sets of results, as recommended elsewhere. I'm having trouble
figuring out how to convert the results in the output to a correlation
coefficient, though. I noticed that the r value I get by taking the
square root of the R-squared is different from my own hand calculation
of r derived from the t value in the regression output, sqrt(t squared
divided by (t squared + df)). I'm not sure which r is correct (or
whether either of them is correct).

Thanks in advance for any guidance others may be able to offer.



--
--------------------------------------------------------------
Nicholas Winter                                 434.924.6994 t
Assistant Professor                             434.924.3359 f
Department of Politics                  [email protected] e
University of Virginia          faculty.virginia.edu/nwinter w
S385 Gibson Hall, South Lawn