weighted kappa

an index of interrater agreement that takes into account the degree of disparity between the categorizations assigned by different observers, so that larger disagreements lower the overall value of kappa more than smaller ones. For example, if two raters differ by two categories, that difference is assigned more importance (i.e., given a greater weight) in the analysis than if they differ by only one category. See also Cohen’s kappa.
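The weighting idea can be sketched in code. The following Python function (illustrative; the source gives no formula or implementation) computes a linearly weighted kappa, where a two-category disagreement counts twice as heavily as a one-category disagreement, under the common definition kappa_w = 1 − (weighted observed disagreement) / (weighted chance-expected disagreement):

```python
from collections import Counter

def weighted_kappa(rater_a, rater_b, n_categories):
    """Linearly weighted kappa for two raters assigning ordered
    categories 0..n_categories-1 to the same items.
    A disagreement of k categories carries weight k."""
    n = len(rater_a)
    observed = Counter(zip(rater_a, rater_b))  # joint frequencies
    marg_a = Counter(rater_a)                  # rater A's marginals
    marg_b = Counter(rater_b)                  # rater B's marginals
    obs_dis = 0.0  # weighted observed disagreement
    exp_dis = 0.0  # weighted disagreement expected by chance
    for i in range(n_categories):
        for j in range(n_categories):
            w = abs(i - j)  # linear weight: disparity in categories
            obs_dis += w * observed.get((i, j), 0) / n
            exp_dis += w * (marg_a.get(i, 0) / n) * (marg_b.get(j, 0) / n)
    return 1.0 - obs_dis / exp_dis

# Two raters who mostly agree; one item differs by one category,
# one item differs by two categories.
a = [0, 1, 2, 2, 0, 1, 2, 0]
b = [0, 1, 2, 1, 0, 1, 0, 0]
print(round(weighted_kappa(a, b, 3), 3))  # → 0.586
```

Perfect agreement yields 1.0; the two-category miss on the seventh item pulls the coefficient down more than the one-category miss on the fourth.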