



Re: st: Cohen's Kappa or other interrater agreement statistics


From   "JVerkuilen (Gmail)" <jvverkuilen@gmail.com>
To   statalist@hsphsun2.harvard.edu
Subject   Re: st: Cohen's Kappa or other interrater agreement statistics
Date   Fri, 11 Jan 2013 12:20:47 -0500

On Fri, Jan 11, 2013 at 12:04 PM, Ilian, Henry (ACS)
<Henry.Ilian@dfa.state.ny.us> wrote:

> I'm not entirely happy about this approach because the content for the items may not be similar enough. Also, it
> doesn't allow me to identify items that are troublesome.
>
> Can anyone see a better way of testing for interrater agreement with this type of data?

It's possible to do a lot of agreement modeling in a loglinear
framework. Alan Agresti has written several articles on the topic, as
has John Uebersax. See Agresti's Categorical Data Analysis, 2nd
Edition, Wiley, Chapter 10, for citations, or
http://john-uebersax.com/stat/agree.htm.
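For what it's worth, the plain kappa in the subject line is only a few
lines of code outside Stata as well (in Stata, -kap- computes it
directly). A minimal sketch in Python, with made-up counts for a 2x2
rater-by-rater table:

```python
import numpy as np

def cohens_kappa(table):
    """Cohen's kappa from a square rater-by-rater contingency table."""
    t = np.asarray(table, dtype=float)
    n = t.sum()
    p_o = np.trace(t) / n                        # observed agreement
    p_e = t.sum(axis=1) @ t.sum(axis=0) / n**2   # chance agreement from marginals
    return (p_o - p_e) / (1 - p_e)

# Hypothetical counts: rows = rater 1, columns = rater 2
print(cohens_kappa([[20, 5], [10, 15]]))   # kappa is about 0.4 here
```

The loglinear approach goes further than this single summary number: it
lets you model the structure of the disagreements (e.g. a
quasi-independence model with a separate term for the diagonal), which
is closer to what you need if you want to identify troublesome items.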
*
*   For searches and help try:
*   http://www.stata.com/help.cgi?search
*   http://www.stata.com/support/faqs/resources/statalist-faq/
*   http://www.ats.ucla.edu/stat/stata/
