Inter-annotator agreement
Inter-annotator agreement (IAA) refers to the degree of agreement among independent annotators when labeling data for a task. IAA is a standard measure of reliability and data quality in annotated corpora, surveys, or image datasets, helping determine whether a labeling scheme and its guidelines yield consistent results and whether the data are suitable for downstream use. It is distinct from intra-annotator agreement, which assesses an individual’s consistency over time.
Common metrics used to quantify IAA include percent agreement, Cohen’s kappa for two annotators on categorical labels, Fleiss’ kappa for more than two annotators, Krippendorff’s alpha for varied measurement levels and missing annotations, and the intraclass correlation coefficient for continuous ratings.
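As an illustration, the following is a minimal Python sketch of Cohen’s kappa for two annotators assigning categorical labels; the label values and annotations are assumptions made up for demonstration, not drawn from any particular dataset.

    from collections import Counter

    def cohen_kappa(labels_a, labels_b):
        """Cohen's kappa: (p_o - p_e) / (1 - p_e)."""
        n = len(labels_a)
        # Observed agreement: fraction of items both annotators labeled identically.
        p_o = sum(a == b for a, b in zip(labels_a, labels_b)) / n
        # Chance agreement: product of each annotator's marginal label frequencies.
        freq_a, freq_b = Counter(labels_a), Counter(labels_b)
        p_e = sum((freq_a[c] / n) * (freq_b[c] / n)
                  for c in freq_a.keys() & freq_b.keys())
        return (p_o - p_e) / (1 - p_e)

    # Hypothetical labels from two annotators on six items.
    ann1 = ["pos", "pos", "neg", "neu", "pos", "neg"]
    ann2 = ["pos", "neg", "neg", "neu", "pos", "pos"]
    print(f"Cohen's kappa: {cohen_kappa(ann1, ann2):.3f}")

Established libraries also provide tested implementations (for example, scikit-learn’s sklearn.metrics.cohen_kappa_score); the hand-rolled version above is only meant to make the observed-versus-chance structure of the statistic explicit.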
Calculation typically involves selecting a representative sample of items, having annotators label it independently, computing the chosen statistic, and reporting its uncertainty, for example with confidence intervals.
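One way to report that uncertainty, sketched below under the assumption of a simple item-level nonparametric bootstrap, is to resample the annotated items with replacement and recompute the agreement statistic on each resample; the function names, resample count, and example data are illustrative choices rather than a prescribed procedure.

    import random

    def percent_agreement(a, b):
        # Fraction of items on which the two annotators chose the same label.
        return sum(x == y for x, y in zip(a, b)) / len(a)

    def bootstrap_ci(labels_a, labels_b, stat_fn, n_resamples=1000, alpha=0.05, seed=0):
        # Percentile bootstrap confidence interval over item-level resamples.
        rng = random.Random(seed)
        items = list(zip(labels_a, labels_b))
        estimates = []
        for _ in range(n_resamples):
            sample = [rng.choice(items) for _ in items]  # resample with replacement
            a, b = zip(*sample)
            estimates.append(stat_fn(a, b))
        estimates.sort()
        lower = estimates[int((alpha / 2) * n_resamples)]
        upper = estimates[int((1 - alpha / 2) * n_resamples) - 1]
        return lower, upper

    # Hypothetical labels from two annotators on six items.
    ann1 = ["pos", "pos", "neg", "neu", "pos", "neg"]
    ann2 = ["pos", "neg", "neg", "neu", "pos", "pos"]
    low, high = bootstrap_ci(ann1, ann2, percent_agreement)
    print(f"Percent agreement: {percent_agreement(ann1, ann2):.2f} "
          f"(95% CI {low:.2f}-{high:.2f})")

Because the statistic is passed in as a function, the same resampling wrapper can be pointed at kappa or another agreement measure without modification.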
IAA informs assessments of dataset quality, guides the design of annotation guidelines and protocols, and influences how models are trained and evaluated across fields.