NVIDIA-Merlin / Transformers4Rec

Transformers4Rec is a flexible and efficient library for sequential and session-based recommendation and works with PyTorch.
https://nvidia-merlin.github.io/Transformers4Rec/main
Apache License 2.0

Users' Long term and short term interests #730

Open NamartaVij opened 1 year ago

NamartaVij commented 1 year ago

I am trying to model users' stable long-term interests and dynamic short-term interests for next-action prediction. Is there documentation on training two XLNet models, one transformer to capture the historical behaviour and the other to extract short-term (current) interests according to timestamp?

Is it possible to create a custom architecture that trains two transformer towers together with different inputs and then concatenates them at a later point?
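
For context, this is the standard single-tower setup I am starting from, adapted from the library's README quick-start (the schema and sizes below are placeholders for my data):

```python
from transformers4rec import torch as tr
from transformers4rec.torch.ranking_metric import NDCGAt, RecallAt

# Placeholder: use your own schema, e.g. tr.Schema().from_json(SCHEMA_PATH)
schema = tr.data.tabular_sequence_testing_data.schema
max_sequence_length, d_model = 20, 64

# Input module that processes the tabular sequence features
input_module = tr.TabularSequenceFeatures.from_schema(
    schema,
    max_sequence_length=max_sequence_length,
    continuous_projection=64,
    aggregation="concat",
    masking="causal",
)

# XLNet transformer configuration
transformer_config = tr.XLNetConfig.build(
    d_model=d_model, n_head=4, n_layer=2, total_seq_length=max_sequence_length
)

# Model body: inputs, projection, and the transformer block
body = tr.SequentialBlock(
    input_module,
    tr.MLPBlock([d_model]),
    tr.TransformerBlock(transformer_config, masking=input_module.masking),
)

# Next-item prediction head with top-N evaluation metrics
metrics = [NDCGAt(top_ks=[20, 40], labels_onehot=True),
           RecallAt(top_ks=[20, 40], labels_onehot=True)]
head = tr.Head(
    body,
    tr.NextItemPredictionTask(weight_tying=True, metrics=metrics),
    inputs=input_module,
)
model = tr.Model(head)
```

The question is how to duplicate the input module and transformer block into two towers, one over the long-term history and one over the current session, and merge them before the head.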

vivpra89 commented 1 year ago

@NamartaVij any leads on how to get this two-tower model set up using the Transformers4Rec API?

NamartaVij commented 1 year ago

@vivpra89 You mentioned two sequences. If you want to divide a sequence into two separate sequences, one for long-term (historical) data and another for short-term (current) data, how would you do this? There are many sources describing this kind of architecture in general, but I am asking specifically about working with Transformers4Rec.

vivpra89 commented 1 year ago

@NamartaVij Not yet successful in getting the architecture right. I am not sure how to connect two TabularSequenceFeatures modules with a concat and then pass the result to a body + prediction head. Do you have any code or ideas on how to do that?

As far as the sequences go: long term = the last 12 months of data, short term = in-session data.
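
For the split itself, something like this is what I have in mind. It is only a sketch and assumes an interaction log with `user_id`, `session_id`, and a unix-seconds `timestamp` column (all names are placeholders):

```python
import pandas as pd

data = pd.read_parquet("interactions.parquet")  # placeholder path

# Long term: everything from the last ~12 months, excluding the current session.
cutoff = data["timestamp"].max() - 365 * 24 * 60 * 60

# Assumes session_id increases over time, so each user's max is their current session.
current_session = data.groupby("user_id")["session_id"].transform("max")

train_long_term = data[(data["timestamp"] >= cutoff)
                       & (data["session_id"] != current_session)]
# Short term: only the interactions from each user's current session.
train_short_term = data[data["session_id"] == current_session]
```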

NamartaVij commented 1 year ago

@vivpra89 I will share the code soon; I am still getting a few errors.

Long-term sequence (historical data):

```python
train_long_term = data.iloc[:500_000]
```

Short-term sequence (recent data):

```python
train_short_term = data.iloc[500_000:600_000]
```

So first you create the schema for both sequences, feed the two sequences separately into two TabularSequenceFeatures modules, train them separately, and then combine them at the end? Is that your idea?

Also, what exactly is the issue occurring during concatenation? There are several methods to combine representations; which ones does Transformers4Rec support?
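
For the concatenation step itself, I was picturing plain PyTorch on the tower outputs; this is only an illustration with made-up shapes, not a Transformers4Rec API:

```python
import torch
import torch.nn as nn

# Illustration only: h_long and h_short stand in for the hidden states
# coming out of the two transformer towers, each (batch, seq_len, d_model).
d_model = 64
proj = nn.Linear(2 * d_model, d_model)

h_long = torch.randn(8, 20, d_model)
h_short = torch.randn(8, 20, d_model)

# Concatenate along the feature dimension, then project back to d_model
# so a downstream prediction head sees the size it expects.
fused = proj(torch.cat([h_long, h_short], dim=-1))  # (8, 20, 64)
```

If the two towers have different sequence lengths, you could instead mean-pool the long-term tower to a single vector and broadcast it along the short-term sequence before concatenating.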

vivpra89 commented 1 year ago

Two cases:

1. Train separately: train one transformer first, then use its embeddings as pretrained embedding input into the second tabular sequence (a rough sketch of the embedding hand-off is below).
2. Train together: train both the long-term and short-term towers jointly (like a two-tower model: one tower is short term and the other is long term) - not sure how to solve this.
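
For case 1, the hand-off could be done with plain state_dict surgery after training the long-term model. `long_model` and `short_model` are placeholders, and the substring match below is a heuristic based on inspecting parameter names, not a stable Transformers4Rec API:

```python
# Assumes long_model is already trained and short_model is freshly built
# with an item-id embedding table of the same shape.
long_state = long_model.state_dict()
short_state = short_model.state_dict()

for name, tensor in long_state.items():
    # Copy only the item-id embedding table(s) across models.
    if "item_id" in name and "embedding" in name and name in short_state:
        short_state[name] = tensor.clone()

short_model.load_state_dict(short_state)

# Optionally freeze the transferred embeddings while training the second stage.
for name, param in short_model.named_parameters():
    if "item_id" in name and "embedding" in name:
        param.requires_grad = False
```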

I was thinking of this: https://github.com/NVIDIA-Merlin/Transformers4Rec/issues/725#issuecomment-1623128957, but it doesn't seem right.

NamartaVij commented 1 year ago

@vivpra89 I am getting an error. Could you please give your suggestions on it? Please check #735.

russelnelson commented 11 months ago

> I am trying to model users' stable long-term interests and dynamic short-term interests for next-action prediction. Is there documentation on training two XLNet models, one transformer to capture the historical behaviour and the other to extract short-term (current) interests according to timestamp?
>
> Is it possible to create a custom architecture that trains two transformer towers together with different inputs and then concatenates them at a later point?

Have you looked into sub-interest learning as a possible solution? The idea is to think of users as having many sub-interests, only one of which is active at a time. For instance, a YouTube user has three sub-interests: DIY home repair videos, concerts, and news programs. The home repair sub-interest is active rarely, when something is broken in their house; the concerts sub-interest is only active on weekend nights; and so on. You would then learn embeddings of users, items, and sub-interests, trained with the users' positive (watched) and negative (skipped) feedback.
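
A toy sketch of what I mean; the sizes and the max-over-sub-interests scoring rule are illustrative assumptions, not from a specific paper:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy sizes for illustration only.
n_users, n_items, n_subinterests, dim = 1_000, 5_000, 4, 32

user_emb = nn.Embedding(n_users, dim)
item_emb = nn.Embedding(n_items, dim)
subinterest_emb = nn.Parameter(torch.randn(n_subinterests, dim))

def score(user_ids, item_ids):
    # Each (user, sub-interest) pair is a candidate "active interest" vector;
    # only the best-matching sub-interest contributes to the item score.
    u = user_emb(user_ids)                          # (B, dim)
    i = item_emb(item_ids)                          # (B, dim)
    candidates = u.unsqueeze(1) + subinterest_emb   # (B, n_subinterests, dim)
    return (candidates * i.unsqueeze(1)).sum(-1).max(dim=1).values  # (B,)

# BPR-style objective: watched (positive) items should outscore skipped (negative) ones.
def bpr_loss(user_ids, pos_item_ids, neg_item_ids):
    return -F.logsigmoid(score(user_ids, pos_item_ids)
                         - score(user_ids, neg_item_ids)).mean()
```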