noveens / distill_cf

[ NeurIPS '22 ] Data distillation for recommender systems. Shows equivalent performance with 2-3 orders of magnitude less data.
MIT License

Very good work #3

Closed Coder-Yu closed 1 year ago

Coder-Yu commented 1 year ago

Hi Noveens,

DistillCF is great work. I am also working on data distillation these days. I came across your paper a few months ago but did not read it at the time. I have recently been developing my own model, and when I finally read your paper today, I couldn't believe that the technical details are almost the same as my design (using multi-round Gumbel-Softmax to generate discrete values and L1 regularization to meet the budget). Even the experiments and the emphasis on data-centric methods in your paper are similar to my plan. What a coincidence! 😆
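For concreteness, here is a minimal sketch of the shared idea, not either of our actual implementations: multi-round Gumbel-Softmax sampling of a discrete synthetic interaction matrix, plus an L1-style penalty that keeps the total number of sampled interactions near a budget. All names, shapes, hyperparameters, and the stand-in distillation loss below are illustrative assumptions.

```python
# Sketch only: multi-round Gumbel-Softmax sampling with an L1 budget penalty.
import torch
import torch.nn.functional as F

n_fake_users, n_items = 64, 1000      # size of the synthetic data summary (assumed)
num_rounds, tau = 10, 0.5             # sampling rounds and Gumbel-Softmax temperature
budget, budget_weight = 5000.0, 1e-3  # target interaction count and penalty weight

# Learnable per-user item logits parameterize the synthetic data.
logits = torch.nn.Parameter(torch.zeros(n_fake_users, n_items))
opt = torch.optim.Adam([logits], lr=1e-2)

def sample_synthetic(logits):
    """Draw `num_rounds` straight-through Gumbel-Softmax samples per fake user
    and sum them, yielding discrete interaction counts in [0, num_rounds]."""
    counts = 0.0
    for _ in range(num_rounds):
        counts = counts + F.gumbel_softmax(logits, tau=tau, hard=True)
    return counts

for step in range(100):
    synthetic = sample_synthetic(logits)

    # Placeholder for the real distillation objective (e.g., matching the
    # performance of a model trained on synthetic vs. real data).
    distillation_loss = synthetic.pow(2).mean()

    # L1-style penalty on the excess interaction count, pulling the synthetic
    # data toward the interaction budget (one plausible reading of "L1
    # regularization to meet the budget").
    budget_loss = budget_weight * torch.relu(synthetic.sum() - budget)

    loss = distillation_loss + budget_loss
    opt.zero_grad()
    loss.backward()
    opt.step()
```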

It looks like I will have to redesign my model from some new perspectives. Thank you for your contributions to data distillation. I also like the infinite-AE part 👍

noveens commented 1 year ago

Sent you a DM! Thanks for your nice comment :)