hwwang55 / KGCN

A tensorflow implementation of Knowledge Graph Convolutional Networks
MIT License

Is KGCN for Recommender Systems Inductive in Nature? #30

Open sachinsharma9780 opened 2 years ago

sachinsharma9780 commented 2 years ago

Hi,

I am going through the paper, and one thing I find missing is information about the inductiveness of the proposed algorithm.

So my question is: is the proposed architecture inductive in nature, i.e., can it generalise to new users without retraining?

Thanks, Sachin

hwwang55 commented 2 years ago

Hi Sachin,

Thanks for your interest in our work! Our method is item-inductive but not user-inductive: we have an item KG that can help us calculate the representation of a new item, but we do not have such a KG on the user side.
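
To make "item-inductive" concrete, here is a minimal NumPy sketch of one KGCN-style aggregation hop for an unseen item: since the item is an entity in the KG, its representation can be computed from its neighbors' entity and relation embeddings, weighted by user-relation scores, without retraining. The helper name, the sum-style aggregator, and seeding `self_emb` from the item's KG entity are my assumptions for illustration, not code from this repo.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def aggregate_new_item(user_emb, neighbor_ents, neighbor_rels, self_emb, W, b):
    """One user-personalized aggregation hop for a new item (hypothetical helper).

    user_emb:      (d,) embedding of the target user
    neighbor_ents: (n, d) embeddings of the item's KG neighbors
    neighbor_rels: (n, d) embeddings of the connecting relations
    self_emb:      (d,) initial embedding of the item's own KG entity
    W, b:          trained aggregator weights, shapes (d, d) and (d,)
    """
    # user-relation scores pi_r^u = <u, r>, normalized over the neighborhood
    scores = softmax(neighbor_rels @ user_emb)   # (n,)
    neigh = scores @ neighbor_ents               # (d,) weighted neighbor mix
    # sum-style aggregator: combine self and neighborhood, then nonlinearity
    return np.tanh(W @ (self_emb + neigh) + b)
```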

sachinsharma9780 commented 2 years ago

Thank you for the response @hwwang55 .

So suppose I add a new user (u1) to the interaction matrix (Y), let's say with some engagements, and now we want to find out whether u1 will engage with a movie, say "Titanic", which is already in the KG. In this case, can't we generate user-specific (u1) embeddings for the movie (Titanic)?

I am thinking from the perspective of building a movie recommendation application with your proposed algorithm, where we can somehow recommend movies to new users without retraining the KGCN algorithm.

hwwang55 commented 2 years ago

It depends on how you design the user embeddings. If user embeddings are randomly initialized embedding vectors, you cannot deal with the cold-start problem. If user embeddings are based on user features, e.g., output by an MLP that takes a user's initial features as input, then you can do the inference without retraining the model. Thanks!
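
A minimal sketch contrasting the two designs described above. The dimensions, variable names, and the specific MLP shape are assumptions; the point is only that option 1 has no row for an unseen user, while option 2 can embed any user whose features are available, as long as the MLP weights were trained jointly with the rest of the model.

```python
import numpy as np

rng = np.random.default_rng(0)
N_USERS, FEAT_DIM, DIM = 1000, 8, 16

# Option 1: randomly initialized lookup table.
# A user unseen during training has no row here, so cold start fails.
user_table = rng.normal(scale=0.1, size=(N_USERS, DIM))

# Option 2: derive the embedding from user features with a small MLP.
# A new user with features can then be embedded at inference time,
# provided W1, b1, W2, b2 were learned during training.
def user_embedding(features, W1, b1, W2, b2):
    h = np.maximum(features @ W1 + b1, 0.0)  # ReLU hidden layer
    return h @ W2 + b2                       # (DIM,) user embedding
```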

sachinsharma9780 commented 2 years ago

I am going through your other paper, "Knowledge-aware Graph Neural Networks with Label Smoothness Regularization for Recommender Systems", which claims that label smoothness adds an inductive bias to the algorithm.

Can this algorithm generalise to new users without retraining?

hwwang55 commented 2 years ago

You still need an MLP to calculate user embeddings in KGNN-LS.

sachinsharma9780 commented 2 years ago

Thanks for clarifying. So in the paper, user embeddings are randomly initialized embedding vectors, aren't they?

hwwang55 commented 2 years ago

Correct. You can of course calculate user embeddings using their initial features if available.

sachinsharma9780 commented 2 years ago

Just a question out of curiosity:

If user features are available (e.g., demographics, sex, etc.), then we can create user embeddings via an MLP. Afterwards, how can we use these embeddings to generate recommendations for a new user?

sachinsharma9780 commented 2 years ago

Another main difficulty is finding a standard dataset that provides user features such as demographics. As far as I know, no standard dataset provides this kind of user information.

hwwang55 commented 2 years ago

Once you have the user embedding, you can use it to calculate the user-specific adjacency matrix, then run a GCN on this adjacency matrix. Item embeddings are contained in the output of the GCN. Finally, you can predict user engagement labels using the user and item embeddings.
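
To make that pipeline concrete, here is a minimal NumPy sketch of the inference path (user embedding → user-specific adjacency → GCN → engagement scores). The dense `(n, n, d)` relation layout, the relation-scoring function g(u, r) = sigmoid(⟨u, r⟩), and the helper names are illustrative assumptions; the actual KGNN-LS implementation is sparser and more involved.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def score_items(user_emb, edge_mask, rel_emb, ent_emb, gcn_weights):
    """Score every KG entity (and hence every item) for one user.

    user_emb:    (d,) user embedding, e.g., from the feature MLP above
    edge_mask:   (n, n) 0/1 adjacency of the KG
    rel_emb:     (n, n, d) relation embedding for each entity pair
    ent_emb:     (n, d) initial entity embeddings
    gcn_weights: list of trained (d, d) layer weight matrices
    """
    n = ent_emb.shape[0]
    # user-specific adjacency: A_u[i, j] = g(u, r_ij) on existing edges
    A = sigmoid(rel_emb @ user_emb) * edge_mask + np.eye(n)  # self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A.sum(axis=1)))       # symmetric norm
    A_hat = d_inv_sqrt @ A @ d_inv_sqrt
    H = ent_emb
    for W in gcn_weights:                    # feed-forward GCN layers
        H = np.maximum(A_hat @ H @ W, 0.0)   # ReLU
    # the rows of H corresponding to items are the item embeddings;
    # engagement probability = sigmoid(<user, item>)
    return sigmoid(H @ user_emb)
```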

sachinsharma9780 commented 2 years ago

> Once you have the user embedding, you can use it to calculate the user-specific adjacency matrix, then run a GCN on this adjacency matrix. Item embeddings are contained in the output of the GCN. Finally, you can predict user engagement labels using the user and item embeddings.

In this case, does our model need to be trained on user-specific side information?

rituk commented 2 years ago

@sachinsharma9780 were you able to create the KG for a new dataset? Can you list the steps, if you don't mind?