Cohen's kappa, or Cohen's kappa coefficient, is a quantitative measure of agreement on categorical variables between two raters (inter-rater reliability) or between one rater's assessments at two time points (intra-rater reliability). A Cohen's kappa of 0 indicates agreement no better than chance, while a Cohen's kappa of 1 indicates perfect agreement.
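The statistic is conventionally defined as kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed proportion of agreement and p_e is the agreement expected by chance from the raters' marginal category frequencies. The following is a minimal sketch, not part of the original text, illustrating that formula on two hypothetical lists of ratings; the function name and example data are illustrative assumptions.

```python
# Minimal sketch of Cohen's kappa: observed agreement p_o versus
# chance agreement p_e, combined as kappa = (p_o - p_e) / (1 - p_e).
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two equal-length lists of categorical ratings."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)

    # Observed agreement: fraction of items rated identically by both raters.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n

    # Chance agreement: for each category, the product of the marginal
    # proportions with which each rater used that category, summed over categories.
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    categories = set(rater_a) | set(rater_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)

    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two raters label ten items as "yes" or "no".
a = ["yes", "yes", "no", "yes", "no", "yes", "yes", "no", "no", "yes"]
b = ["yes", "no",  "no", "yes", "no", "yes", "yes", "yes", "no", "yes"]
print(round(cohens_kappa(a, b), 3))  # 0.565: agreement better than chance, short of perfect
```

In this example the raters agree on 8 of 10 items (p_o = 0.8), chance agreement from their marginal frequencies is p_e = 0.54, giving kappa roughly 0.565, which lies between the chance (0) and perfect-agreement (1) endpoints described above.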