dkpro / dkpro-statistics

DKPro Statistics
https://dkpro.github.io/dkpro-statistics
Apache License 2.0

Calculating Kappa agreement with perfect expected agreement #1

Closed: chmeyer closed this issue 9 years ago

chmeyer commented 9 years ago

Originally reported on Google Code with ID 1

Hello,
I am using DKPro Statistics version 1.0.0.

When I run TwoRaterKappaAgreement with a single item on which both annotators agree,
I expect the Kappa agreement to be 1.0. As the following test case demonstrates, this
is not the case:

    final AnnotationStudy study = new AnnotationStudy(2);   // study with two raters
    study.addItem("c", "c");                                 // both raters assign category "c"

    final double expectedKappaAgreement = 1.0;
    final double actualKappaAgreement = new TwoRaterKappaAgreement(study).calculateAgreement();
    Assert.assertEquals(expectedKappaAgreement, actualKappaAgreement, 0.001);

The result is:

    AssertionFailedError (expected:<1.0> but was:<NaN>)

I am not an expert in IAA measures; is Kappa defined for one item?

Looking into the code, the problem seems to be that observed and expected agreement
are both 1.0, which leads to a division by zero.

Regards,
Roland

Reported by roland.kluge.de on 2013-12-06 10:06:26

chmeyer commented 9 years ago
If only one annotation item and one category are present, the expected agreement is 1
(since there is only one category, the two raters effectively have no choice at all).
Thus, Cohen's kappa calculates to:

kappa = (A_O - A_E) / (1 - A_E) = (1 - 1) / (1 - 1) = 0 / 0 = NaN
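
In plain Java double arithmetic this is exactly the degenerate 0/0 case; a minimal sketch (illustrative only, not the library's actual code) shows where the NaN comes from:

    // For a study with a single item and a single category, both values are 1.0.
    double observedAgreement = 1.0;  // the two raters agree on the one item
    double expectedAgreement = 1.0;  // with only one category, chance agreement is also 1

    // kappa = (A_O - A_E) / (1 - A_E)
    double kappa = (observedAgreement - expectedAgreement) / (1.0 - expectedAgreement);
    System.out.println(kappa);       // prints NaN (0.0 / 0.0 under IEEE 754)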

Returning 1 in this case would be misleading IMHO.
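
Given that, a caller could guard against the undefined result explicitly. Here is a minimal sketch reusing the classes from the report above (the NaN behaviour of calculateAgreement() is assumed as described in this issue):

    final AnnotationStudy study = new AnnotationStudy(2);
    study.addItem("c", "c");

    final double kappa = new TwoRaterKappaAgreement(study).calculateAgreement();
    if (Double.isNaN(kappa)) {
        // Degenerate study: expected agreement is 1, so kappa is undefined.
        System.out.println("Kappa is undefined for this study");
    } else {
        System.out.println("Kappa = " + kappa);
    }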

(BTW: sorry for the late response, e-mail notifications were turned off :-/).

Reported by chmeyer.de on 2014-07-25 13:51:14