Journal article in Diagnostic and Interventional Imaging, 2020

Interobserver agreement issues in radiology

Abstract

Agreement between observers (i.e., inter-rater agreement) can be quantified with various criteria, but their appropriate selection is critical. When the measure is qualitative (nominal or ordinal), the proportion of agreement or the kappa coefficient should be used to evaluate inter-rater consistency (i.e., inter-rater reliability). The kappa coefficient is more meaningful than the raw percentage of agreement, because the latter does not account for agreement due to chance alone. When the measures are quantitative, the intraclass correlation coefficient (ICC) should be used to assess agreement, but this should be done with care because there are different ICCs, so it is important to describe the model and type of ICC being used. The Bland-Altman method can be used to assess consistency and conformity, but its use should be restricted to the comparison of two raters. © 2020 Société française de radiologie. Published by Elsevier Masson SAS. All rights reserved.
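To make the three criteria concrete, below is a minimal sketch in Python (NumPy only) of Cohen's kappa for two raters, one common ICC variant, ICC(2,1) (two-way random effects, absolute agreement, single rater, in the Shrout and Fleiss convention), and the Bland-Altman bias with 95% limits of agreement. The function names, the toy data, and the choice of the ICC(2,1) variant are illustrative assumptions, not taken from the article.

```python
import numpy as np

def cohen_kappa(r1, r2):
    """Chance-corrected agreement between two raters on nominal labels."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    cats = np.union1d(r1, r2)
    p_obs = np.mean(r1 == r2)                                  # raw agreement
    p_chance = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in cats)
    return (p_obs - p_chance) / (1.0 - p_chance)

def icc_2_1(scores):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.
    `scores` is an (n_subjects, k_raters) array of quantitative measures."""
    x = np.asarray(scores, dtype=float)
    n, k = x.shape
    grand = x.mean()
    ms_subj = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)
    ms_rater = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)
    resid = x - x.mean(axis=1, keepdims=True) - x.mean(axis=0) + grand
    ms_err = (resid ** 2).sum() / ((n - 1) * (k - 1))
    return (ms_subj - ms_err) / (
        ms_subj + (k - 1) * ms_err + k * (ms_rater - ms_err) / n)

def bland_altman(a, b):
    """Bias and 95% limits of agreement between two raters' measures."""
    d = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    bias = d.mean()
    half_width = 1.96 * d.std(ddof=1)
    return bias, (bias - half_width, bias + half_width)

# Toy data (hypothetical): two radiologists rate the same cases.
labels_a = [1, 0, 1, 1, 0, 1, 0, 0]           # e.g. lesion present/absent
labels_b = [1, 0, 0, 1, 0, 1, 0, 1]
sizes = [[10.2, 10.0], [14.1, 15.0], [9.8, 9.5], [22.0, 21.2], [17.5, 18.1]]

print(f"kappa = {cohen_kappa(labels_a, labels_b):.3f}")
print(f"ICC(2,1) = {icc_2_1(sizes):.3f}")
bias, (lo, hi) = bland_altman([r[0] for r in sizes], [r[1] for r in sizes])
print(f"Bland-Altman bias = {bias:.2f} mm, 95% LoA = ({lo:.2f}, {hi:.2f}) mm")
```

Note that the ICC computation makes the abstract's warning tangible: ICC(2,1) would give a different value than, say, a consistency-type or average-rater ICC on the same data, which is why the model and type should always be reported.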

Dates and versions

hal-03266816, version 1 (17-10-2022)

License

Attribution - NonCommercial (CC BY-NC)


Cite

Mehdi Benchoufi, E. Matzner-Lober, Nicolas Molinari, A.-S. Jannot, P. Soyer. Interobserver agreement issues in radiology. Diagnostic and Interventional Imaging, 2020, 101 (10), pp.639-641. ⟨10.1016/j.diii.2020.09.001⟩. ⟨hal-03266816⟩