
Appendix B (Cohen’s Kappa)

Cohen’s kappa can be used to assess the intercoder reliability of an individual (item-level) code.
  • In the following formula for kappa ([latex]\kappa[/latex]), [latex]Pr(a)[/latex] is the relative observed agreement among coders for a given item, and [latex]Pr(e)[/latex] is the hypothetical probability of chance agreement for that item, computed from each coder’s observed frequencies of assigning each possible code category: [latex]\kappa=\frac{Pr(a)-Pr(e)}{1-Pr(e)}[/latex] (a worked sketch in code follows this list).
  • If the coders are in complete agreement, then [latex]\kappa[/latex] equals 1. If observed agreement is exactly what would be expected by chance, then [latex]\kappa[/latex] equals 0, and [latex]\kappa[/latex] is negative when agreement falls below chance.
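
For concreteness, the short Python sketch below computes [latex]\kappa[/latex] from the formula above for two coders applying a single yes/no code; the function name, category labels, and data are illustrative assumptions, not taken from the appendix.

```python
# Minimal sketch of Cohen's kappa for one item coded by two coders.
# The coder lists below are hypothetical example data.
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two equal-length lists of codes for one item."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)

    # Pr(a): relative observed agreement among the two coders
    pr_a = sum(a == b for a, b in zip(coder_a, coder_b)) / n

    # Pr(e): chance agreement, from each coder's marginal code frequencies
    freq_a = Counter(coder_a)
    freq_b = Counter(coder_b)
    categories = set(coder_a) | set(coder_b)
    pr_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)

    return (pr_a - pr_e) / (1 - pr_e)

# Hypothetical example: two coders assign a yes/no code to ten responses.
coder_1 = ["yes", "yes", "no", "yes", "no", "no", "yes", "yes", "no", "yes"]
coder_2 = ["yes", "no",  "no", "yes", "no", "yes", "yes", "yes", "no", "no"]
print(round(cohens_kappa(coder_1, coder_2), 3))
```

In this made-up example the coders agree on 7 of 10 responses (Pr(a) = 0.7) while chance agreement is Pr(e) = 0.5, so the sketch prints [latex]\kappa = (0.7-0.5)/(1-0.5) = 0.4[/latex].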