Absolute Agreement Vs Consistency

We need to understand that there are no default values for acceptable reliability with the ICC. A low ICC can reflect not only a low degree of agreement between the raters or measurements, but also a lack of variability among the sampled subjects, a small number of subjects, or a small number of raters [2, 20]. As a general rule, researchers should try to obtain at least 30 heterogeneous samples and include at least 3 raters when conducting a reliability study. Under such conditions, we propose that ICC values below 0.5 indicate poor reliability, values between 0.5 and 0.75 indicate moderate reliability, values between 0.75 and 0.9 indicate good reliability, and values above 0.9 indicate excellent reliability [2].

This statement, which applies to both Model 2 and Model 3, is somewhat stronger than simply saying that ICC(C,1) is a measure of consistency; in fact, it was proposed by Alexander as early as 1947 [17]. The simulations discussed below also demonstrate this. The distributions of ICC(C,1) are insensitive to the presence of bias: they remain the same no matter how strong the bias is. With Eq (13) we find ρ_{2C} = 10²/(10² + 5²) = 0.8 for the three ICC(C,1) distributions. They coincide both with the ICC(1) distribution (zero bias) and with the ICC(A,1) distribution when the bias is very small. This indicates that the confidence limits of ρ_{2C} (the population consistency ICC of Model 2) are the same as the confidence limits of ρ₁ (the population ICC of Model 1, i.e. without bias). Thus, in fig.
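As a rough illustration of the point above, here is a minimal Python sketch (not from the source) that simulates a two-way ratings matrix with subject standard deviation σ_r = 10 and error standard deviation σ_v = 5, as in the ρ_{2C} = 10²/(10² + 5²) = 0.8 example, and estimates ICC(C,1) and ICC(A,1) via the standard two-way ANOVA mean-square formulas (McGraw & Wong forms). The sample size, the bias values, and the random seed are assumptions chosen for the demonstration; the rater biases affect only the absolute-agreement ICC, while the consistency ICC stays near 0.8.

```python
import numpy as np

def icc_single(x):
    """Return (ICC(C,1), ICC(A,1)) for an n-subjects-by-k-raters matrix,
    using two-way ANOVA mean squares (McGraw & Wong single-rater forms)."""
    n, k = x.shape
    grand = x.mean()
    row_means = x.mean(axis=1)   # per-subject means
    col_means = x.mean(axis=0)   # per-rater means
    msr = k * ((row_means - grand) ** 2).sum() / (n - 1)   # between-subjects
    msc = n * ((col_means - grand) ** 2).sum() / (k - 1)   # between-raters
    sse = ((x - row_means[:, None] - col_means[None, :] + grand) ** 2).sum()
    mse = sse / ((n - 1) * (k - 1))                         # residual
    icc_c = (msr - mse) / (msr + (k - 1) * mse)             # consistency
    icc_a = (msr - mse) / (msr + (k - 1) * mse + k / n * (msc - mse))  # agreement
    return icc_c, icc_a

rng = np.random.default_rng(0)
n, k = 500, 3                                   # assumed sample sizes
subjects = rng.normal(0, 10, size=(n, 1))       # sigma_r = 10
noise = rng.normal(0, 5, size=(n, k))           # sigma_v = 5
bias = np.array([0.0, 4.0, -8.0])               # hypothetical rater biases

icc_c, icc_a = icc_single(subjects + noise + bias)
# icc_c is close to the population value 0.8; icc_a is pulled down by the bias
```

Re-running the sketch with `bias = np.zeros(3)` makes the two estimates nearly coincide, which mirrors the statement that ICC(A,1) matches ICC(C,1) only when the bias is very small.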
