recommenders-team / recommenders

Best Practices on Recommendation Systems
https://recommenders-team.github.io/recommenders/intro.html
MIT License

[ASK] Perfect MAP@k is less than 1 #2091

Closed daviddavo closed 2 months ago

daviddavo commented 2 months ago

Description

I have a recommender where, for some users in some folds, the ground truth contains fewer than $k$ items. Therefore, $precision@k$ is less than 1 even for a recommender that recommends exactly the ground truth. For that reason, I compute the results of a perfect recommender as a reference for several metrics.
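For example, with the usual convention of dividing by $k$, a user with only 3 items in the ground truth can reach at most $precision@5 = 3/5$, even if all 3 items are ranked at the top.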

By definition, the perfect $ndcg@k$ is 1. I thought the same held for $MAP@k$, but it does not: the average perfect $MAP@5$ across my folds is 0.99, and one fold even has a perfect $MAP@5$ of 0.7! I've also noticed that the perfect $MAP@k$ is exactly equal to $recall@k$, but I haven't found any resources explaining this coincidence.

Keep in mind that I'm talking about implicit feedback, and the ideal recommender simply assigns a prediction of 1 to every item in the ground truth.

Other Comments

I'll try to provide an example that reproduces this "issue".
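
A minimal sketch of what such an example might look like, assuming the evaluation functions in `recommenders.evaluation.python_evaluation` and their default column names (`userID`, `itemID`, `rating`, `prediction`); exact keyword arguments may differ between library versions. One user has 12 relevant items, which is more than $k=5$, and the "perfect" recommender scores exactly those items with 1:

```python
import pandas as pd

from recommenders.evaluation.python_evaluation import (
    map_at_k,
    ndcg_at_k,
    precision_at_k,
    recall_at_k,
)

K = 5
N_REL = 12  # more relevant items than recommendation slots

# Implicit-feedback ground truth: one user with 12 relevant items.
true = pd.DataFrame(
    {"userID": [1] * N_REL, "itemID": list(range(N_REL)), "rating": [1] * N_REL}
)

# "Perfect" recommender: it scores exactly the ground-truth items with 1.
pred = true.rename(columns={"rating": "prediction"})

for name, metric in [
    ("precision@5", precision_at_k),
    ("recall@5", recall_at_k),
    ("ndcg@5", ndcg_at_k),
    ("map@5", map_at_k),
]:
    print(name, metric(true, pred, k=K))

# With these inputs, precision@5 and ndcg@5 should come out as 1, while
# recall@5 and map@5 should both be 5/12 (see the explanation below).
```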

daviddavo commented 2 months ago

For those who came here from Google:

I think it is easier to understand if we first look at $recall@k$.

If a user has more relevant items in the ground truth than $k$, then the recall can never reach 1, not even with a recommender that knows the test set. Say you have a single user with 12 interactions: with $k=5$ the maximum recall is $5/12$, because you only get 5 recommendation slots.
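
In general, for a user $u$ with ground-truth set $R_u$, the best any top-$k$ list can achieve is

$$\max \; recall@k(u) = \frac{\min(k, |R_u|)}{|R_u|},$$

which is $5/12$ in the example above.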

MAP is derived from the precision-recall curve: for each user, $AP@k$ adds up the precision at each hit position and, with the convention that matches the numbers above, normalises by the number of relevant items. For a perfect recommender that is just the fraction of relevant items that were recovered, i.e. the recall. Therefore, the maximum achievable $MAP@k$ equals the maximum achievable $recall@k$, which in general is not 1.
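
Written out, with $R_u$ the set of relevant items of user $u$, $P@i(u)$ the precision at rank $i$, and $\mathrm{rel}_u(i) \in \{0, 1\}$ indicating whether the item at rank $i$ is relevant:

$$AP@k(u) = \frac{1}{|R_u|} \sum_{i=1}^{k} P@i(u)\,\mathrm{rel}_u(i).$$

For a perfect recommender the first $\min(k, |R_u|)$ positions are all relevant with $P@i(u) = 1$, and the remaining terms are 0, so

$$AP@k(u) = \frac{\min(k, |R_u|)}{|R_u|} = recall@k(u).$$

Note that this assumes the convention of normalising by $|R_u|$; definitions that normalise by $\min(k, |R_u|)$ instead would give a perfect $AP@k$ of 1.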