Closed KlausGlueckert closed 1 year ago
Hello @KlausGlueckert
Thank you for submitting this issue to us. I hope I understood and answered your request correctly. Feel free to give me more details, like examples!
The problem you want to solve is a ranking-based recommendation problem, also called "learning to rank" (as in the article Recommendation Systems with Distribution-Free Reliability Guarantees). Here, CatBoost Ranking is the model you have chosen among many possible rankers.
In conformal prediction, you expect based on "a pre-trained ranking model […] to return a set of items that is rigorously guaranteed to contain mostly good items."
So your request is: how can MAPIE produce a conformal prediction set from a ranking model, or from its ranking scores? Is that correct?
I understand that you are studying this paper and want to implement their proposal. The authors propose to apply the "learn then test" framework for calibration in order to propose conformal prediction for "learning to rank" problems.
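To make the idea concrete, here is a minimal sketch of threshold calibration for a ranking model, in the spirit of the risk-control methods that paper builds on. It is not MAPIE code and not the paper's exact "learn then test" procedure (which uses p-values and multiple testing); instead it uses the simpler conformal risk control rule for a monotone risk. All names (`bad_fraction`, the toy data, `alpha`, `lam_hat`) are illustrative assumptions, not an existing API:

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Toy calibration data (assumed): ranking scores from a pre-trained
# ranker and ground-truth relevance labels, one row per user slate.
n_users, n_items = 500, 50
scores = rng.normal(size=(n_users, n_items))
# Higher scores are more likely to be good, so the ranker is informative.
is_good = rng.random((n_users, n_items)) < 1 / (1 + np.exp(-scores))

def bad_fraction(scores, is_good, lam):
    """Risk of the set {items with score >= lam}: fraction of returned
    items that are NOT good, averaged over users (empty sets count as 0)."""
    risks = []
    for s, g in zip(scores, is_good):
        picked = s >= lam
        risks.append(0.0 if picked.sum() == 0 else (~g[picked]).mean())
    return float(np.mean(risks))

alpha = 0.2  # target: at most 20% bad items in the returned set, on average
lambdas = np.quantile(scores, np.linspace(0.0, 0.99, 100))  # candidate thresholds
n = n_users

# Conformal risk control: among thresholds whose inflated empirical risk
# stays below alpha, keep the smallest one (i.e. the largest item set).
valid = [lam for lam in lambdas
         if (n / (n + 1)) * bad_fraction(scores, is_good, lam)
            + 1 / (n + 1) <= alpha]
lam_hat = min(valid)

# At prediction time, the "prediction set" for a new user is simply
# every item whose ranking score clears the calibrated threshold.
new_scores = rng.normal(size=n_items)
prediction_set = np.where(new_scores >= lam_hat)[0]
```

The same recipe would apply to CatBoost Ranking outputs: replace the toy `scores` with the model's predicted scores on a held-out calibration set.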
I confirm that we plan to work on the "learn then test" calibration framework (for regression, then for classification) in a future issue. However, we will close this issue for now, as it is not on our 2023 roadmap.
Is your feature request related to a problem? Please describe.
I am using CatBoost Ranking.

Describe the solution you'd like
Can you add a module to build conformity intervals for CatBoost ranks?

Describe alternatives you've considered
Studying and implementing this paper: Recommendation Systems with Distribution-Free Reliability Guarantees