
st: kappa and specific agreement for unbalanced sample

From   Ann Montgomery <>
Subject   st: kappa and specific agreement for unbalanced sample
Date   Fri, 12 Feb 2010 11:25:02 -0500

I am comparing the inter-rater reliability of two raters who have coded results; for each variable coded I now have either a 2x2 table (+/-; cells a, b / c, d) or a 3x3 table (+, -, and unknown).

I've calculated kappa and its confidence interval (I'm familiar with kappa, less so with Finn's r). In other papers, I have noted that some authors present kappa, sensitivity, and specificity (taking one rater as the gold standard), together with specific agreement (e.g. positive specific agreement = 2a/(2a+b+c)), since kappa is sensitive to prevalence when positives are rare (a << d). Presenting all of these results lets the reader judge the coders' quality and consistency from several statistical viewpoints.
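To make the quantities above concrete, here is a minimal sketch of the arithmetic for a single 2x2 table. The table layout (a = both raters positive, d = both negative, b and c the disagreements) and the counts are made-up illustration values, not data from this analysis; the specific-agreement formulas follow the standard 2a/(2a+b+c) form.

```python
# Specific agreement and Cohen's kappa for one 2x2 agreement table.
# Layout: a = both raters +, b and c = the two disagreement cells,
# d = both raters -.  Counts below are hypothetical illustration values.
a, b, c, d = 10, 4, 6, 80
n = a + b + c + d

# Positive and negative specific agreement
pa = 2 * a / (2 * a + b + c)  # agreement conditional on a positive rating
na = 2 * d / (2 * d + b + c)  # agreement conditional on a negative rating

# Cohen's kappa for comparison
po = (a + d) / n                                      # observed agreement
pe = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2   # chance-expected agreement
kappa = (po - pe) / (1 - pe)

print(f"PA = {pa:.3f}, NA = {na:.3f}, kappa = {kappa:.3f}")
# With a << d, PA is noticeably lower than NA even though overall
# agreement (a+d)/n looks high -- the prevalence effect described above.
```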

Q. In Stata 9.2, how do I calculate positive and negative specific agreement for the 2x2 and 3x3 tables?

Thanks in advance, Ann Montgomery in Toronto
