lileipisces / PEPLER

TOIS'23, Personalized Prompt Learning for Explainable Recommendation
124 stars 20 forks

Question about continuous prompt initialization #2

Open menglin0320 opened 1 year ago

menglin0320 commented 1 year ago

I wonder whether you tried simply averaging the word embeddings of the title to initialize the continuous prompts. That seems like a simpler solution to the problem you mention in the paper. I'd like to try some of the paper's ideas, but the two-stage training scares me off a bit (I only have limited time for this project). Do you think "average the title's embeddings to initialize the continuous prompts" is a valid idea?

lileipisces commented 1 year ago

You can give it a shot. I think titles, which consist of real words, are more compatible with the model than random embeddings. This could make the model converge faster and shorten the training time.
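For reference, the idea in the question can be sketched in a few lines. This is a hedged illustration only, not PEPLER's actual code: the vocabulary, embedding table, and function name below are toy stand-ins for the pretrained LM's word-embedding lookup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for the pretrained LM's word-embedding table
# (hypothetical names, not the repo's API): vocab maps title
# words to rows of the embedding matrix.
vocab = {"the": 0, "shawshank": 1, "redemption": 2}
emb_table = rng.standard_normal((len(vocab), 8))    # (vocab_size, dim)

def init_prompt_from_title(title, n_prompt_tokens=2):
    """Average the title's word embeddings and tile the mean across
    the continuous prompt positions, instead of random init."""
    ids = [vocab[w] for w in title.lower().split()]
    mean_emb = emb_table[ids].mean(axis=0)           # (dim,)
    # One vector per prompt position, all starting from the mean;
    # these would then be trained as the continuous prompt.
    return np.tile(mean_emb, (n_prompt_tokens, 1))   # (n_prompt_tokens, dim)

prompt = init_prompt_from_title("The Shawshank Redemption")
print(prompt.shape)  # (2, 8)
```

In a real setup you would replace `emb_table` with the LM's input embedding matrix (e.g. GPT-2's `wte` weights) and the whitespace split with the model's tokenizer, then register the returned matrix as a trainable parameter.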