The Kappa coefficient is a statistical measure of inter-rater agreement for qualitative (categorical) items. Measure developers can think of Cohen's kappa as a chance-corrected measure of proportional agreement. Possible values range from +1 (perfect agreement) through 0 (no agreement beyond that expected by chance) to -1 (complete disagreement).
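As a sketch of the idea, the standard definition is kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed proportional agreement and p_e is the agreement expected by chance from each rater's marginal label frequencies. The short Python example below illustrates this calculation for two raters; the function name and the sample labels are illustrative, not part of any particular library.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning categorical labels.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement
    and p_e is the agreement expected by chance from the raters'
    marginal label frequencies.
    """
    if len(rater_a) != len(rater_b) or not rater_a:
        raise ValueError("Ratings must be equal-length, non-empty sequences")
    n = len(rater_a)

    # Observed proportional agreement: fraction of items given identical labels.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n

    # Chance agreement: for each category, the product of the two raters'
    # marginal proportions, summed over all categories both raters used.
    counts_a = Counter(rater_a)
    counts_b = Counter(rater_b)
    p_e = sum((counts_a[c] / n) * (counts_b[c] / n)
              for c in counts_a.keys() & counts_b.keys())

    if p_e == 1:
        # Degenerate case: both raters always use the same single category.
        return 1.0
    return (p_o - p_e) / (1 - p_e)


# Hypothetical example: two raters labelling ten items as "yes"/"no".
a = ["yes", "yes", "no", "yes", "no", "no", "yes", "yes", "no", "yes"]
b = ["yes", "no", "no", "yes", "no", "yes", "yes", "yes", "no", "yes"]
print(round(cohens_kappa(a, b), 3))  # p_o = 0.8, p_e = 0.52, kappa ≈ 0.583
```

In this example the raters agree on 8 of 10 items (p_o = 0.8), but because both use "yes" and "no" with similar frequency, chance alone would produce agreement about half the time (p_e = 0.52), so the chance-corrected kappa is noticeably lower than the raw agreement.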