recommenders / rival

RiVal recommender system evaluation toolkit
rival.recommenders.net
Apache License 2.0

Implement non-accuracy metrics #58

Closed: abellogin closed this issue 9 years ago

abellogin commented 10 years ago

Like novelty, diversity, etc. We should try to involve someone who has already measured them (e.g., Saúl Vargas @saulvargas)

Milestone: later than 0.5?

alansaid commented 10 years ago

If @saulvargas has the time to spend on this, it would be great! :+1:

alansaid commented 10 years ago

I think we should strive to have some basic metrics, e.g. ILS, earlier than 0.5. Definitely no later than 0.5, at least.
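For reference, ILS (intra-list similarity) is the average pairwise similarity of the items in a recommendation list; lower values indicate a more diverse list. A minimal, hypothetical sketch (not RiVal's actual API; the item-similarity function is left abstract and the class/method names are made up for illustration):

```java
import java.util.List;
import java.util.function.BiFunction;

/**
 * Illustrative sketch of Intra-List Similarity (ILS): the mean pairwise
 * similarity of items in a recommendation list. This is a hypothetical
 * example, not RiVal's actual implementation.
 */
public class IntraListSimilarity {

    public static <I> double ils(List<I> items, BiFunction<I, I, Double> sim) {
        int n = items.size();
        if (n < 2) {
            return 0.0; // no pairs to compare
        }
        double sum = 0.0;
        int pairs = 0;
        for (int i = 0; i < n; i++) {
            for (int j = i + 1; j < n; j++) {
                sum += sim.apply(items.get(i), items.get(j));
                pairs++;
            }
        }
        return sum / pairs;
    }

    public static void main(String[] args) {
        // Toy similarity: 1.0 if two items share the same genre label, else 0.0.
        List<String> list = List.of("rock", "rock", "jazz");
        System.out.println(ils(list, (a, b) -> a.equals(b) ? 1.0 : 0.0));
        // one similar pair out of three -> 1/3
    }
}
```

In practice the similarity function would come from item features or co-rating statistics; the metric itself is agnostic to that choice.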

saulvargas commented 10 years ago

Hi! I'd be happy to contribute. Implementing the basic version of some common diversity and novelty metrics is easy and straightforward. I propose to implement the following:

alansaid commented 10 years ago

@saulvargas awesome! :+1: Please do the following:

  1. fork the project
  2. do your magic
  3. submit pull requests
  4. go to 2
abellogin commented 10 years ago

Btw @saulvargas, if during this process you have any suggestions to improve the evaluation module, please let us know and we can discuss them (a comment here should be enough).

alansaid commented 10 years ago

@saulvargas any progress here? Can we (me and/or @abellogin) do anything to help you?

saulvargas commented 10 years ago

Done! I implemented the basic version of some novelty and diversity metrics, that is, without relevance weighting or rank discount. If you are interested in those functionalities, then we should talk...
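A "basic" novelty metric in this sense, with no relevance weighting or rank discount, could look like the following sketch: the mean complement of item popularity over the list. All names here are hypothetical, chosen only to illustrate the idea, and do not reflect RiVal's actual classes:

```java
import java.util.List;
import java.util.Map;

/**
 * Illustrative sketch of a basic popularity-complement novelty metric:
 * each item's novelty is 1 - popularity(item), where popularity is the
 * fraction of users who have interacted with the item. No relevance
 * weighting or rank discount is applied. Hypothetical example, not
 * RiVal's actual implementation.
 */
public class ItemNovelty {

    public static <I> double meanNovelty(List<I> recommended, Map<I, Double> popularity) {
        if (recommended.isEmpty()) {
            return 0.0;
        }
        double sum = 0.0;
        for (I item : recommended) {
            // Unseen items get popularity 0.0, i.e. maximal novelty.
            sum += 1.0 - popularity.getOrDefault(item, 0.0);
        }
        return sum / recommended.size();
    }

    public static void main(String[] args) {
        // Toy popularity: item A seen by 75% of users, item B by 25%.
        Map<String, Double> pop = Map.of("A", 0.75, "B", 0.25);
        System.out.println(meanNovelty(List.of("A", "B"), pop)); // (0.25 + 0.75) / 2 -> 0.5
    }
}
```

A rank-discounted or relevance-aware variant would weight each term before averaging, which is exactly the extension left out of the "basic" versions discussed above.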

alansaid commented 10 years ago

:+1:

alansaid commented 10 years ago

Current iteration closed in 47b2347afe9e13c8aa785b5feaa3831c70ff608f

saulvargas commented 10 years ago

I would like to release an implementation of the "binomial diversity" metric of the paper in RecSys 2014 (http://ir.ii.uam.es/saul/pubs/recsys2014-vargas-tid.pdf). Do you think it could be included in RiVal?

alansaid commented 10 years ago

Sure, go for it. I think that for more esoteric/non-standard metrics it would make sense to update the Evaluation Metrics page in the documentation/wiki.

alansaid commented 10 years ago

I suggest you open an issue for the metric you want to implement and assign it to yourself. If you have a timeline for when you think you might have it done, assign it to the milestone that fits your time estimate.

abellogin commented 10 years ago

Awesome! Besides, this would give more publicity to the framework, which is great ;)