Cohen’s Kappa Calculator
Introduction & Importance
Cohen’s Kappa is a statistical measure of inter-rater reliability for qualitative (categorical) items. Unlike simple percent agreement, it corrects for the agreement expected by chance, which makes it well suited to evaluating the reliability of content moderation, survey coding, and similar annotation tasks. Our calculator removes the need for manual computation.
How to Use This Calculator
- Enter the observed agreement (P_o) and expected agreement (P_e) values.
- Click ‘Calculate’.
- View the results and chart below.
Formula & Methodology
Cohen’s Kappa (κ) is calculated as:
κ = (P_o - P_e) / (1 - P_e)
Where:
- P_o is the observed proportion of agreement between the raters,
- P_e is the proportion of agreement expected by chance,
- κ is Cohen’s Kappa, ranging from -1 to 1: 1 indicates perfect agreement, 0 indicates agreement no better than chance, and negative values indicate less agreement than chance.
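If you have the raw labels from two raters rather than precomputed P_o and P_e values, the formula above can be sketched in a few lines of Python. This is a minimal illustration, not the calculator’s actual implementation: P_o is the fraction of items both raters labelled identically, and P_e is derived from each rater’s marginal category frequencies.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Compute Cohen's kappa from two raters' category labels."""
    assert len(rater_a) == len(rater_b) and rater_a, "need paired, non-empty ratings"
    n = len(rater_a)
    # Observed agreement P_o: fraction of items both raters labelled the same.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement P_e: chance agreement implied by each rater's
    # marginal label frequencies.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(counts_a[c] * counts_b[c] for c in counts_a) / n**2
    return (p_o - p_e) / (1 - p_e)

a = ["yes", "yes", "no", "yes", "no", "no"]
b = ["yes", "no", "no", "yes", "no", "yes"]
print(round(cohens_kappa(a, b), 3))  # P_o = 4/6, P_e = 0.5, so κ ≈ 0.333
```

For this example the two raters agree on 4 of 6 items (P_o ≈ 0.667) while chance alone would predict 0.5, yielding a modest κ of about 0.333.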