Appendix B (Cohen’s Kappa)

Cohen’s kappa can be used to assess the intercoder reliability of an individual (item) code.

  • In the following formula for kappa (\(\kappa\)), \(Pr(a)\) is the relative observed agreement among coders for a given item, and \(Pr(e)\) is the hypothetical probability of chance agreement, calculated from the observed data using the probabilities of each coder randomly assigning each possible code category for that item (a worked sketch follows this list): \(\kappa=\frac{Pr(a)-Pr(e)}{1-Pr(e)}\).
  • If the coders are in complete agreement, then \(\kappa\) equals 1. If the observed agreement is no better than what would be expected by chance, then \(\kappa\) is 0, and it is negative when agreement is worse than chance.
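
As an illustration, the following Python sketch computes \(Pr(a)\), \(Pr(e)\), and \(\kappa\) for two coders. The coder labels and item codes are hypothetical and serve only to show the arithmetic behind the formula above.

    from collections import Counter

    def cohens_kappa(coder1, coder2):
        """Cohen's kappa for two coders who labeled the same items."""
        n = len(coder1)
        # Pr(a): relative observed agreement among the coders
        pr_a = sum(a == b for a, b in zip(coder1, coder2)) / n
        # Pr(e): hypothetical chance agreement, from each coder's marginal
        # probability of assigning each possible code category
        counts1, counts2 = Counter(coder1), Counter(coder2)
        categories = set(coder1) | set(coder2)
        pr_e = sum((counts1[c] / n) * (counts2[c] / n) for c in categories)
        return (pr_a - pr_e) / (1 - pr_e)

    # Hypothetical codes assigned by two coders to the same eight items
    coder1 = ["yes", "yes", "no", "yes", "no", "no", "yes", "yes"]
    coder2 = ["yes", "no",  "no", "yes", "no", "yes", "yes", "yes"]
    print(cohens_kappa(coder1, coder2))

For these hypothetical data, \(Pr(a)=0.75\) and \(Pr(e)=0.53125\), giving \(\kappa\approx 0.467\).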