benhamner / Metrics

Machine learning evaluation metrics, implemented in Python, R, Haskell, and MATLAB / Octave

Fix average precision at k calculation #54

Open raminqaf opened 3 years ago

raminqaf commented 3 years ago

This PR fixes #49.

According to the Wikipedia page on Average Precision, the metric is defined as:

$$\operatorname{AveP} = \frac{\sum_{k=1}^{n} P(k) \cdot \operatorname{rel}(k)}{\text{number of relevant documents}}$$

where rel(k) is an indicator function equaling 1 if the item at rank k is a relevant document, and zero otherwise. Note that the average is taken over all relevant documents, and relevant documents that are not retrieved get a precision score of zero.

Before, the average was calculated over the minimum of the length of the actual list and k. This doesn't seem right: as the length of the actual list grows beyond k, AP@K should decrease (unretrieved relevant documents should contribute a precision of zero), but capping the denominator at k prevents that. I fixed this and cleaned up the code. Please consider merging this! The current behavior could lead to many incorrect evaluations.
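For reference, here is a minimal Python sketch of the corrected calculation under this definition. The name and signature `apk(actual, predicted, k)` follow the style of the library's existing function, but this is an illustration of the described fix, not the exact diff in this PR:

```python
def apk(actual, predicted, k=10):
    """Average precision at k, averaged over all relevant documents.

    actual    : list of relevant items (order does not matter)
    predicted : list of predicted items, ordered by decreasing confidence
    k         : cutoff rank
    """
    if not actual:
        return 0.0

    # Only the top-k predictions are scored.
    predicted = predicted[:k]

    score = 0.0
    num_hits = 0.0
    for i, p in enumerate(predicted):
        # Count a hit only the first time a relevant item appears.
        if p in actual and p not in predicted[:i]:
            num_hits += 1.0
            score += num_hits / (i + 1.0)  # precision at rank i + 1

    # Average over *all* relevant documents, so relevant items that were
    # never retrieved contribute a precision of zero.
    return score / len(actual)
```

To see the difference, take `actual = [1, 2, 3, 4, 5]`, `predicted = [1, 2]`, `k = 2`: both retrieved items are hits, so the summed precision is 2.0. With the old denominator `min(len(actual), k) = 2` the score is 1.0, even though three relevant documents were never retrieved; averaging over all 5 relevant documents gives 0.4.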