RUCAIBox / RecBole

A unified, comprehensive and efficient recommendation library
https://recbole.io/
MIT License
3.27k stars 590 forks

[🐛BUG] SINE attention weighting implementation #1991

Open Elvenson opened 5 months ago

Elvenson commented 5 months ago

Description:

Based on the SINE paper, in the attention weighting step the trainable positional embeddings are added to the input item embeddings before computing P_{t|k}, so the model can use item position in the attention weights. However, in RecBole's SINE attention weighting implementation, the original input embeddings are used directly, without the positional embeddings.
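To make the reported discrepancy concrete, here is a minimal, dependency-light sketch of the attention weighting described above. It is not RecBole's actual code: the array shapes, the projection vector `w`, and the random initialization are illustrative assumptions; the only point is that the positional embeddings are added to the item embeddings *before* the attention logits are computed.

```python
import numpy as np

def attention_weights_with_position(item_emb, pos_emb, w):
    """Sketch of SINE-style attention weighting P_{t|k} over a session.

    item_emb: (L, D) item embeddings for the L items in the session
    pos_emb:  (L, D) trainable positional embeddings (random here, for illustration)
    w:        (D,)   stand-in attention projection for one intention k
    """
    # The fix the issue asks for: add positions BEFORE scoring,
    # instead of scoring the raw item embeddings.
    x = item_emb + pos_emb

    logits = x @ w                      # (L,) one score per position
    e = np.exp(logits - logits.max())   # numerically stable softmax
    return e / e.sum()                  # P_{t|k}: sums to 1 over positions

rng = np.random.default_rng(0)
L, D = 5, 8
p = attention_weights_with_position(
    rng.normal(size=(L, D)), rng.normal(size=(L, D)), rng.normal(size=D)
)
```

Without the `item_emb + pos_emb` sum, two identical items at different positions would receive identical attention scores, which is what the issue reports about the current implementation.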

Here is the detail in the paper for reference:

[Screenshot of the relevant attention weighting equations from the SINE paper]
Fotiligner commented 4 months ago

@Elvenson Thank you for your advice on RecBole. We will revise the code according to the paper, test it on the datasets, and update it soon.