benfred / implicit

Fast Python Collaborative Filtering for Implicit Feedback Datasets
https://benfred.github.io/implicit/
MIT License
3.57k stars · 612 forks

Can't use 'explain' method in class 'AlternatingLeastSquares' #646

Open · singsinghai opened 1 year ago

singsinghai commented 1 year ago

I found this code in implicit/cpu/als.py: def explain(self, userid, user_items, itemid, user_weights=None, N=10). But when I tried to use it I got the error below:

AttributeError: 'AlternatingLeastSquares' object has no attribute 'explain'

I have pip installed the latest version of implicit, but it still doesn't work. Can you help me clarify if I'm missing something?

benfred commented 1 year ago

Are you using the GPU model (i.e., does model.__class__ show implicit.gpu.als.AlternatingLeastSquares)? The GPU code doesn't have this method implemented, but you can convert to a CPU model with model.to_cpu() and then call explain on that.
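For reference, a minimal sketch of that workflow, assuming an ALS model trained with use_gpu=True and a user-by-item CSR matrix named user_items (the data and variable names here are illustrative):

```python
import implicit

# With use_gpu=True this returns the GPU class,
# implicit.gpu.als.AlternatingLeastSquares, which lacks explain().
model = implicit.als.AlternatingLeastSquares(factors=64, use_gpu=True)
model.fit(user_items)  # user_items: users-by-items CSR matrix of confidences

# Convert to the CPU class, which implements explain().
cpu_model = model.to_cpu()
total_score, top_contributions, user_weights = cpu_model.explain(
    userid=0, user_items=user_items, itemid=42, N=10
)
```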

singsinghai commented 1 year ago

> Are you using the GPU model (i.e., does model.__class__ show implicit.gpu.als.AlternatingLeastSquares)? The GPU code doesn't have this method implemented, but you can convert to a CPU model with model.to_cpu() and then call explain on that.

Dear benfred,

Thanks for the explanation, it works for me. Can I ask a further question? top_contributions is described as "a list of the top N (itemid, score) contributions for this user/item pair", but what is that score based on? Is it the initial event_strength of the sparse matrix we passed in for training, or the matrix after it has been filled in with co-similarity scores?
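As a point of reference while this question is open, a hedged sketch of how explain's return values line up with the docstring quoted above; the third value can be passed back in through the user_weights parameter to avoid recomputing the per-user factorization on repeated calls (the ids and names are illustrative, reusing cpu_model from the sketch above):

```python
# explain() returns the total predicted score, the top-N (itemid, score)
# contributions, and the Cholesky factorization computed for this user.
total_score, top_contributions, user_weights = cpu_model.explain(
    userid=0, user_items=user_items, itemid=42, N=10
)
for contributing_itemid, contribution in top_contributions:
    print(contributing_itemid, contribution)

# Reuse the cached factorization when explaining another item for the same user.
_, more_contributions, _ = cpu_model.explain(
    userid=0, user_items=user_items, itemid=99, user_weights=user_weights
)
```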

dminovski0 commented 1 year ago

Can the explain method be used to explain similar users, i.e., why certain users were recommended to a specific user?

essefi-ahlem commented 1 year ago

I have another question @benfred @ita9naiwa: when recommending items for user id=1, I get a score of 1.35 for item 8708.

[screenshot: recommend output showing item 8708 with a score of 1.35 for user 1]

When I want to explain why item 8708 was recommended for user 1, the first value returned by explain is supposed to be "The total predicted score for this user/item pair". I expected it to equal what I got from recommend (1.35), but instead it is a different score: 0.56.

[screenshot: explain output showing a total score of 0.56 for the same user/item pair]

So my question is: what is the difference between the score given by recommend and the score given by explain?
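For what it's worth, the two numbers come from different computations in the CPU implementation: recommend scores items with the user factor stored from training, while explain re-derives a score from the user's interactions via a Cholesky solve, so some gap between them is expected. A small sketch of the comparison, reusing the illustrative cpu_model and user_items names from above:

```python
# recommend() scores items with the user factor stored from training.
ids, scores = cpu_model.recommend(1, user_items[1], N=10)
print(ids[0], scores[0])  # e.g. item 8708 with score 1.35, as reported above

# explain() re-derives a score from user 1's interactions, so its total
# need not match recommend()'s number exactly.
total_score, top_contributions, _ = cpu_model.explain(1, user_items, itemid=8708)
print(total_score)  # e.g. 0.56, as reported above

# Note: top_contributions holds only the N largest (itemid, score) terms,
# while total_score sums contributions over all of the user's interactions.
```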