Saturday, November 12, 2022

ch.19 Kappa statistics for agreement



The Kappa statistic analyzes the agreement between two diagnostic tests.


Both tests give a positive or negative result. Counting the cases where both tests are positive, both are negative, only the first is positive, and only the second is positive gives a total of four values (a 2×2 table). Enter each count in the corresponding cell.
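
For reference, the same 2×2 table can also be entered directly in R outside the menu. The sketch below uses made-up counts and the hypothetical names testA and testB.

# A minimal sketch: the four cell counts entered as a 2x2 matrix.
# The counts and the names testA/testB are made-up examples.
tab <- matrix(c(40,  5,    # testA positive: testB positive / testB negative
                10, 45),   # testA negative: testB positive / testB negative
              nrow = 2, byrow = TRUE,
              dimnames = list(testA = c("positive", "negative"),
                              testB = c("positive", "negative")))
tab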


The result shows the Kappa value and its 95% confidence interval. Agreement is judged by this value, not by a p-value.
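
If you want to reproduce this outside the menu, one option (my own choice here, not necessarily what the menu uses internally) is the Kappa() function of the vcd package, applied to the 2×2 matrix tab built in the sketch above.

# Sketch: Cohen's kappa and its 95% confidence interval with the vcd package,
# reusing the matrix 'tab' from the previous sketch.
# install.packages("vcd")   # if the package is not installed yet
library(vcd)
k <- Kappa(tab)   # unweighted (and weighted) kappa estimate
k
confint(k)        # 95% confidence intervals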

These results are usually interpreted with the criteria of Landis and Koch (The Measurement of Observer Agreement for Categorical Data, 1977).

These are 0 to 0.2: Slight, 0.2 to 0.4: Fair, 0.4 to 0.6: Moderate, 0.6 to 0.8: Substantial, and 0.8 to 1: Almost perfect.
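
As a convenience, these categories can be wrapped in a small helper function; this is my own sketch, not part of the book.

# Map a kappa value to the Landis & Koch (1977) category listed above.
interpret_kappa <- function(kappa) {
  cut(kappa,
      breaks = c(0, 0.2, 0.4, 0.6, 0.8, 1),
      labels = c("Slight", "Fair", "Moderate", "Substantial", "Almost perfect"),
      include.lowest = TRUE)
}
interpret_kappa(0.72)   # "Substantial"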


To calculate sensitivity, specificity, or Kappa values, you need tabulated data. This menu makes it very easy.
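
For orientation, sensitivity and specificity follow directly from such a 2×2 table when one of the two classifications is treated as the gold standard; the counts below are hypothetical.

# Sketch: sensitivity and specificity from the four cells of a 2x2 table.
# tp/fn/fp/tn are made-up counts (test result vs. gold standard).
tp <- 40; fn <- 10   # gold-standard positive: test positive / test negative
fp <- 5;  tn <- 45   # gold-standard negative: test positive / test negative
sensitivity <- tp / (tp + fn)   # true positives among all gold-standard positives
specificity <- tn / (tn + fp)   # true negatives among all gold-standard negatives
sensitivity   # 0.8
specificity   # 0.9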


First, select two nominal variables.


Next, select a few options:


The result is presented in a tabular format, called a 'frequency table' or a 'contingency table', or simply a 'table'.
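
The same kind of table can be produced in plain R from two nominal variables; testA and testB below are hypothetical factors, not the book's data.

# Sketch: a contingency (frequency) table from two nominal variables.
testA <- factor(c("positive", "positive", "negative", "negative", "positive"))
testB <- factor(c("positive", "negative", "negative", "negative", "positive"))
table(testA, testB)      # base R frequency table
xtabs(~ testA + testB)   # the same table via a formula interface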

The next part covers the analysis of paired data.

 



 

 
