Agreement between Two Independent Groups of Raters
Authors: Sophie Vanbelle, Adelin Albert
Affiliation: Medical Informatics and Biostatistics, University of Liege, Liege, Belgium
Abstract: We propose a coefficient of agreement to assess the degree of concordance between two independent groups of raters classifying items on a nominal scale. Defined under a population-based model, this coefficient extends the classical Cohen's kappa coefficient for quantifying agreement between two raters. Weighted and intraclass versions of the coefficient are also given, and their sampling variance is determined by the jackknife method. The method is illustrated on the medical education data that motivated the research.
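As background for the two-rater case that the proposed coefficient generalizes, the sketch below computes the classical Cohen's kappa and a delete-one jackknife standard error. This is not the authors' population-based group coefficient; it only illustrates the standard two-rater kappa and the jackknife variance idea mentioned in the abstract, on hypothetical rating data.

```python
import numpy as np

def cohen_kappa(r1, r2):
    """Classical Cohen's kappa for two raters on a nominal scale."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    cats = np.union1d(r1, r2)
    # Observed proportion of agreement.
    po = np.mean(r1 == r2)
    # Expected agreement under independence (product of marginals).
    pe = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in cats)
    return (po - pe) / (1.0 - pe)

def jackknife_se(r1, r2):
    """Delete-one jackknife standard error of Cohen's kappa."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    n = len(r1)
    idx = np.arange(n)
    # Recompute kappa with each item left out in turn.
    reps = np.array([cohen_kappa(r1[idx != i], r2[idx != i])
                     for i in range(n)])
    return np.sqrt((n - 1) / n * np.sum((reps - reps.mean()) ** 2))
```

For example, with perfect agreement kappa equals 1, and for partially agreeing ratings it falls between chance-corrected bounds; the jackknife standard error quantifies its sampling variability without a closed-form variance formula.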
This article is indexed in SpringerLink and other databases.