claws-lab / jodie

A PyTorch implementation of ACM SIGKDD 2019 paper "Predicting Dynamic Embedding Trajectory in Temporal Interaction Networks"
MIT License

update the embeddings from last time point to the first time point #4

Open LeiBAI opened 5 years ago

LeiBAI commented 5 years ago

Hello,

Thanks for the work and the code, which I have enjoyed a lot. I have a question about updating the user/item embeddings. I noticed that the user/item embeddings are global variables. While updating them within one epoch reflects the temporal dynamics, how should we interpret updating the embeddings over and over across epochs? To explain my concern: define the whole time period in the dataset as [0, t1, t2, t3, ..., T]; we update the user/item embeddings following the time sequence from 0 to t1, ..., up to T. However, we normally need to repeat this process for multiple epochs, which means we have to restart from T back to 0 again. Will this cause any training problems?

Thanks and best regards

Lei

SungMinCho commented 4 years ago

Hello. I would also like to know about this. What is the intuition behind running multiple epochs over the same time period? What is changing (or rather, "persisting") over the course of repetition (the RNN weights, for example)? Thanks!!

srijankr commented 4 years ago

@SungMinCho It is used to learn model weights and embeddings.
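To make the answer concrete, here is a minimal, hypothetical sketch of this training pattern in PyTorch. It is not JODIE's actual code: the variable names, the synthetic interaction list, and the squared-error loss are all illustrative stand-ins. The point it shows is that the dynamic embeddings are reset to their t=0 state at the start of every epoch, while the RNN weights persist across epochs and keep being trained, so restarting from T back to 0 is not a problem.

```python
import torch
import torch.nn as nn

# Hypothetical sketch, not JODIE's actual implementation.
torch.manual_seed(0)
num_users, dim = 3, 4

rnn = nn.GRUCell(dim, dim)                       # weights persist across epochs
initial_user_emb = torch.zeros(num_users, dim)   # shared t=0 starting point
optimizer = torch.optim.Adam(rnn.parameters(), lr=1e-2)

# (user, interaction features) ordered by time 0 -> T; purely synthetic data
interactions = [(0, torch.randn(dim)), (1, torch.randn(dim)), (0, torch.randn(dim))]

for epoch in range(3):
    # restart the trajectory: every user's embedding goes back to its t=0 state
    user_emb = {u: initial_user_emb[u] for u in range(num_users)}
    optimizer.zero_grad()
    loss = torch.zeros(())
    for u, feat in interactions:                 # replay interactions in temporal order
        new_emb = rnn(feat.unsqueeze(0), user_emb[u].unsqueeze(0)).squeeze(0)
        loss = loss + ((new_emb - user_emb[u]) ** 2).sum()  # stand-in for JODIE's loss
        user_emb[u] = new_emb                    # advance this user's trajectory
    loss.backward()                              # gradients flow into the RNN weights
    optimizer.step()                             # only the RNN weights are updated
```

So each epoch replays the same interaction sequence from 0 to T, but with progressively better RNN weights; the embedding trajectories are recomputed from scratch, not carried over from the previous epoch's final state.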