



Re: st: how to test significance for difference


From   Austin Nichols <austinnichols@gmail.com>
To   statalist@hsphsun2.harvard.edu
Subject   Re: st: how to test significance for difference
Date   Thu, 25 Apr 2013 10:52:49 -0400

Isabella King <isabella.c.king@gmail.com>:

I can't see that anyone has answered this, but perhaps that is because
the question is both too simple and too complex. Simply put, you use
-test- (whose help file includes the example "Test coefficient on
2.region=coefficient on 4.region" and which has a very good manual
entry) to compare coefficients, proportions, counts, etc.
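As a sketch, using the census dataset shipped with Stata (the variable
and dataset names below are just for illustration):

```stata
* Fit a regression on region indicators, then test equality of two
* of the region coefficients, in the style of the -test- help file
sysuse census, clear
regress medage i.region
test 2.region = 4.region
```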

The more complex part is that there are implicit assumptions about the
distribution of your variables that determine the desirability of
various tests; if you say you want to compare the number of new firms
in two areas, presumably you actually want to compare the birth rate
of new firms. This suggests comparing the proportion of firms in each
area that are new. Perhaps you will want to condition on other
important factors (which would imply having panel data)?

If you just want to compare two proportions, you can -svyset- (with
survey characteristics if you have complex survey data or with
-svyset,srs- if you do not) and then -svy:tab area new, row ci- and
then compare proportions with -test-. This has been discussed many
times on the list, e.g. at
 http://www.stata.com/statalist/archive/2009-05/msg00031.html
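A minimal sketch of those steps, assuming variables -area- (two areas)
and -new- (a 0/1 indicator for new firms) are in memory:

```stata
* Declare simple-random-sampling design, then tabulate row
* proportions of new firms by area with confidence intervals
svyset, srs
svy: tab area new, row ci
* To test equality of the two proportions, fit them as estimates
* first; list the exact coefficient names with -coeflegend-,
* then compare them with -test-
svy: proportion new, over(area)
```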

Otherwise, you want a regression, and the type of regression depends
on your data, but you might use -glm- with a logit link, as in e.g.
 http://www.stata.com/statalist/archive/2008-12/msg01013.html
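For example (again assuming hypothetical variables -new- and -area-):

```stata
* Model the probability that a firm is new as a binomial GLM with
* a logit link, with robust standard errors
glm new i.area, family(binomial) link(logit) vce(robust)
```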

If you are running a regression with two dummy (indicator) variables
representing two groups, you can compare them with the -test- command
again; see e.g.
 http://www.stata.com/support/faqs/statistics/compare-levels-of-categorical-variable/
but often you can simply redefine your dummy (indicator) variables or
change which you are including so that the relevant test appears in
the regression output directly.
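A sketch of both approaches, with hypothetical variables -dummyA-,
-dummyB-, an outcome -y-, and a control -x-:

```stata
* Compare the coefficients on two group indicators after a fit
regress y dummyA dummyB x
test dummyA = dummyB
* or report the difference itself with its standard error and t-stat
lincom dummyA - dummyB
```

Alternatively, dropping -dummyA- and keeping -dummyB- makes the
coefficient on -dummyB- the difference relative to group A, so its
t-statistic appears in the regression output directly.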

If you actually want to compare counts of firms, you probably want a
-poisson- regression or -glm- with a log link, possibly on a constant
and a single indicator for "area B" membership, with robust standard
errors (at minimum).
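A sketch of that count model, assuming a count variable -nfirms- and a
0/1 indicator -areaB- for area B membership:

```stata
* Poisson regression of counts on an area-B indicator, with robust
* standard errors
poisson nfirms i.areaB, vce(robust)
* or equivalently, a GLM with a log link
glm nfirms i.areaB, family(poisson) link(log) vce(robust)
```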

On Wed, Apr 24, 2013 at 5:49 AM, Isabella King
<isabella.c.king@gmail.com> wrote:
> Dear all,
>
> I use -tabulate- to check the distribution of two variables. But I am
> interested in the difference. Which command directly gives the
> difference in value and tests its significance? For example, group 1:
> number of new firms in area A; group 2: number of new firms in area B.
> I get the result and know the difference. But I do not know whether
> the difference is statistically significant. How do I test this
> significance and report the t-value?
>
> Another question: When I run a regression for two groups, I get two
> coefficients and know the difference, but how do I test the
> significance of this difference and report its t-value?
*
*   For searches and help try:
*   http://www.stata.com/help.cgi?search
*   http://www.stata.com/support/faqs/resources/statalist-faq/
*   http://www.ats.ucla.edu/stat/stata/
