Some alternatives are the joint probability of agreement, Cohen’s kappa, Scott’s pi and the related Fleiss’ kappa, inter-rater correlation, the concordance correlation coefficient, the intra-class correlation, and Krippendorff’s alpha.

There are several operational definitions of “inter-rater reliability,” reflecting different viewpoints about what constitutes reliable agreement between raters, and three operational definitions of agreement.

The joint probability of agreement is the simplest and least robust measure. It is estimated as the percentage of the time the raters agree in a nominal or categorical rating system. It does not take into account the fact that agreement may happen solely by chance. There is some question whether there is a need to ‘correct’ for chance agreement; some suggest that, in any case, any such adjustment should be based on an explicit model of how chance and error affect raters’ decisions.
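As a rough illustration of how the joint probability of agreement can be computed, the sketch below counts the fraction of items on which two raters assign the same nominal category. The ratings and the function name percent_agreement are hypothetical, not taken from the source; they only serve to show the calculation.

```python
from typing import Sequence

def percent_agreement(rater_a: Sequence[str], rater_b: Sequence[str]) -> float:
    """Joint probability of agreement for two raters on nominal categories:
    the fraction of items on which both raters chose the same category."""
    if len(rater_a) != len(rater_b):
        raise ValueError("Both raters must rate the same set of items.")
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

# Hypothetical ratings of ten items into two nominal categories.
ratings_a = ["yes", "no", "yes", "yes", "no", "yes", "no", "no", "yes", "yes"]
ratings_b = ["yes", "no", "no", "yes", "no", "yes", "yes", "no", "yes", "no"]

# The raters match on 7 of 10 items, so the joint probability of agreement is 0.70.
print(f"Joint probability of agreement: {percent_agreement(ratings_a, ratings_b):.2f}")
```

Note that this figure ignores how often the raters would match purely by chance, which is the limitation that chance-corrected measures such as Cohen’s kappa are designed to address.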