Cohen’s kappa
(symbol: κ) a numerical index that reflects the degree of agreement between two raters or rating systems classifying data into mutually exclusive categories, corrected for the level of agreement expected by chance alone. Values range from 0 (agreement no better than chance) to 1 (perfect agreement), with kappas below .40 generally considered poor, those from .40 to .75 considered fair to good, and those above .75 considered excellent. By accounting for chance, Cohen’s kappa avoids overestimating the true level of agreement, as can occur when agreement is gauged simply by the proportion of ratings on which the two raters agree. [Jacob Cohen]
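The chance correction can be illustrated with the standard formulation of the index (the worked numbers below are hypothetical):

\[
\kappa = \frac{p_o - p_e}{1 - p_e}
\]

where \(p_o\) is the observed proportion of agreement and \(p_e\) is the proportion of agreement expected by chance, obtained by summing, across categories, the products of the two raters’ marginal proportions for that category. For example, if two raters agree on 80% of cases and chance agreement is 50%, then κ = (.80 − .50)/(1 − .50) = .60, whereas the raw agreement of .80 would overstate the raters’ concordance.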