Hey @LeiShenVictoria
I have not read the Deep Interest Network paper yet, but I will; maybe I can incorporate some of its ideas into the library.
As of right now, the only "kind-of" similar thing you would have here are the attention weights of the models.
All model components that are based on attention mechanisms have an attribute called attention_weights: see here.
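For illustration, this is roughly what that attribute looks like in a generic attention block. This is a minimal PyTorch sketch, not the library's actual class; the name `SimpleSelfAttention` is made up for this example:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SimpleSelfAttention(nn.Module):
    """Minimal self-attention block that stores its attention weights
    so they can be inspected after a forward pass (illustrative only)."""

    def __init__(self, embed_dim: int):
        super().__init__()
        self.q = nn.Linear(embed_dim, embed_dim)
        self.k = nn.Linear(embed_dim, embed_dim)
        self.v = nn.Linear(embed_dim, embed_dim)
        self.attention_weights = None

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch_size, seq_len, embed_dim)
        q, k, v = self.q(x), self.k(x), self.v(x)
        scores = q @ k.transpose(1, 2) / (x.shape[-1] ** 0.5)
        attn = F.softmax(scores, dim=-1)
        # keep the weights around so they can be inspected later
        self.attention_weights = attn.detach()
        return attn @ v


if __name__ == "__main__":
    block = SimpleSelfAttention(embed_dim=16)
    out = block(torch.randn(2, 5, 16))
    print(block.attention_weights.shape)  # torch.Size([2, 5, 5])
```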
I will have a look at the Deep Interest Network paper asap and see if I can come up with a quick answer that is more helpful :)
Hi, thanks for your reply. One more question: how can I implement the embedding-sharing operation for a candidate feature and a sequence feature? I mean something along the lines of the sketch below.
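This is a minimal PyTorch sketch of what I mean by embedding sharing, not this repo's API (the module name `SharedItemEmbedding` is invented for illustration): the candidate item id and the ids in the behaviour sequence are looked up in the same embedding table.

```python
import torch
import torch.nn as nn


class SharedItemEmbedding(nn.Module):
    """Candidate item and behaviour sequence share one embedding table
    (illustrative sketch, not this repo's API)."""

    def __init__(self, n_items: int, embed_dim: int, padding_idx: int = 0):
        super().__init__()
        # single table used for both the candidate and the sequence
        self.item_emb = nn.Embedding(n_items, embed_dim, padding_idx=padding_idx)

    def forward(self, candidate: torch.Tensor, sequence: torch.Tensor):
        # candidate: (batch_size,)         -> (batch_size, embed_dim)
        # sequence:  (batch_size, seq_len) -> (batch_size, seq_len, embed_dim)
        return self.item_emb(candidate), self.item_emb(sequence)


if __name__ == "__main__":
    emb = SharedItemEmbedding(n_items=1000, embed_dim=32)
    cand = torch.randint(1, 1000, (4,))
    seq = torch.randint(0, 1000, (4, 10))
    cand_e, seq_e = emb(cand, seq)
    print(cand_e.shape, seq_e.shape)  # torch.Size([4, 32]) torch.Size([4, 10, 32])
```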
Hey @LeiShenVictoria
I would have to read the paper :)
I am busy at work now, but I'll see what I can do asap.
Hi @LeiShenVictoria
There is a branch called ffm now where DIN is implemented.
In the examples folder there is a script called movielens_din.py
with an example 🙂.
I know the issue was opened a while ago, but it took me a while to find the time.
Let me know if you have any questions
Covered in PR #234
In DeepInterestNetwork, there is a target attention between a candidate feature (a single column) and a sequence feature. How can this target attention be implemented in this repo? I guess it can be considered as an attention between a column in the deep part (the candidate) and the text part (the sequence)... Thanks a lot. To make the question concrete, below is a rough sketch of what I mean.
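A minimal PyTorch sketch of a DIN-style local activation unit, only to illustrate the question (the class `TargetAttention` is made up, not code from this repo, and the scores are normalized with a softmax here for simplicity): the candidate embedding is compared against each behaviour embedding, the resulting scores weight the sequence, and the weighted sum becomes the user-interest representation for that candidate.

```python
import torch
import torch.nn as nn


class TargetAttention(nn.Module):
    """DIN-style local activation unit: attention of a behaviour sequence
    over a single candidate item (illustrative sketch only)."""

    def __init__(self, embed_dim: int, hidden_dim: int = 36):
        super().__init__()
        # small MLP scores the interaction between candidate and each behaviour
        self.score_mlp = nn.Sequential(
            nn.Linear(embed_dim * 4, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, candidate, sequence, mask=None):
        # candidate: (bsz, embed_dim), sequence: (bsz, seq_len, embed_dim)
        seq_len = sequence.size(1)
        cand = candidate.unsqueeze(1).expand(-1, seq_len, -1)
        # concatenate candidate, behaviour, their difference and product
        feats = torch.cat([cand, sequence, cand - sequence, cand * sequence], dim=-1)
        scores = self.score_mlp(feats).squeeze(-1)  # (bsz, seq_len)
        if mask is not None:
            # mask: boolean (bsz, seq_len), True for real (non-padded) positions
            scores = scores.masked_fill(~mask, float("-inf"))
        weights = torch.softmax(scores, dim=-1)
        # weighted sum of behaviours = user interest w.r.t. this candidate
        return (weights.unsqueeze(-1) * sequence).sum(dim=1)  # (bsz, embed_dim)


if __name__ == "__main__":
    attn = TargetAttention(embed_dim=32)
    cand = torch.randn(4, 32)
    seq = torch.randn(4, 10, 32)
    print(attn(cand, seq).shape)  # torch.Size([4, 32])
```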