Graph-Machine-Learning-Group / spin

Official repository for the paper "Learning to Reconstruct Missing Data from Spatiotemporal Graphs with Sparse Observations" (NeurIPS 2022)
https://arxiv.org/abs/2205.13479
MIT License

Where can I find the temporal encoding module? #2

Closed sgdy3 closed 1 year ago

sgdy3 commented 1 year ago

Hi, I've noticed that you mention "we use sine and cosine transforms of the time step t w.r.t. a period of interest (e.g., day and/or week), to account for". However, I can't find the corresponding code in either "positional_encoding" or "utils.py". Is this module not included in the repo because it belongs to the dataset preprocessing step, or did I just miss some part of the repo? I'm also not clear on how this transform works. Does it operate like the positional encoding in "Attention Is All You Need"?

marshka commented 1 year ago

Hi,

the function you refer to is at this line in the script:

https://github.com/Graph-Machine-Learning-Group/spin/blob/2320695ff03b23606e73b05ac87f3ddff9d74c0c/experiments/run_imputation.py#L170

which calls a method inherited from the base PandasDataset class of the torch-spatiotemporal (tsl) library:

https://github.com/TorchSpatiotemporal/tsl/blob/main/tsl/datasets/prototypes/mixin.py#L115-L129

This TensorFlow tutorial explains well what that function does.
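For intuition, here is a minimal sketch of the idea (not the tsl implementation itself): each time step t is mapped to the sine and cosine of its phase within a period of interest, so steps one full period apart get identical encodings. The function name and the choice of 24 steps per day are illustrative assumptions.

```python
import numpy as np

def sin_cos_encode(t: np.ndarray, steps_per_period: int) -> np.ndarray:
    """Encode time steps as (sin, cos) of their phase within a period.

    Illustrative sketch only; the actual encoding lives in tsl's
    datetime_encoded method linked above.
    """
    # Phase of each step within the period (e.g., a day or a week).
    phase = 2 * np.pi * (t % steps_per_period) / steps_per_period
    # Stack sine and cosine so the encoding is continuous and unambiguous.
    return np.stack([np.sin(phase), np.cos(phase)], axis=-1)

# Example: hourly time steps over two days with a daily period.
t = np.arange(48)
enc = sin_cos_encode(t, steps_per_period=24)  # shape (48, 2)
```

Because the phase wraps around, `enc[0]` and `enc[24]` are identical, which is exactly what lets the model recognize the same time of day across different days.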

— Ivan

sgdy3 commented 1 year ago

Thanks for your help, I understand now.