
Cohen's Kappa


Cohen's Kappa coefficient (κ) is a statistical measure of the degree of agreement or concordance between two independent raters that takes into account the possibility that agreement could occur by chance alone.
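The formula itself does not appear in this excerpt, but the standard chance-corrected definition of the coefficient can be written as

$$
\kappa = \frac{p_o - p_e}{1 - p_e},
$$

where $p_o$ is the observed proportion of agreement between the two raters and $p_e$ is the proportion of agreement expected by chance alone, computed from each rater's marginal classification frequencies. A value of 1 indicates perfect agreement, 0 indicates agreement no better than chance, and negative values indicate agreement worse than chance.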

Like other measures of interrater agreement, κ is used to assess the reliability of different raters or measurement methods by quantifying their consistency in placing individuals or items in two or more mutually exclusive categories. For instance, in a study of developmental delay, two pediatricians may independently assess a group of toddlers and classify them with respect to their language development into either “delayed for age” or “not delayed.” One important aspect of the utility of this classification is the presence of good agreement between the two raters. Agreement between two raters could be ...
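As an illustration (not part of the original entry), the following Python sketch computes κ for a hypothetical 2 × 2 table of counts from the two pediatricians in the example above; the counts and the helper name cohens_kappa are invented for this example.

```python
import numpy as np

def cohens_kappa(table):
    """Cohen's kappa for a square contingency table of counts from two raters."""
    table = np.asarray(table, dtype=float)
    n = table.sum()
    p_o = np.trace(table) / n                                   # observed agreement
    p_e = (table.sum(axis=0) * table.sum(axis=1)).sum() / n**2  # agreement expected by chance
    return (p_o - p_e) / (1 - p_e)

# Hypothetical counts: rows = pediatrician A's ratings, columns = pediatrician B's.
#            B: delayed   B: not delayed
table = [[20,  5],   # A: delayed
         [10, 65]]   # A: not delayed

print(cohens_kappa(table))  # 0.625
```

With these hypothetical counts the observed agreement is 0.85, but the agreement expected from the marginal totals is 0.60, so κ = (0.85 − 0.60) / (1 − 0.60) = 0.625, indicating substantial agreement beyond chance.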
