by Shikai Fang, Akil Narayan, Robert M. Kirby, Shandian Zhe
Authors' official PyTorch implementation of Bayesian Continuous-Time Tucker Decomposition (ICML 2022 oral paper).
See more materials (slides, poster) on my page.
As shown in the figures, we assign the Tucker core a temporal Gaussian process prior to capture the continuously time-varying dynamics in the tensor data.
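To make the model concrete, here is a minimal sketch (not the repo's code; the function name and shapes are illustrative) of how a single tensor entry is reconstructed in a Tucker model whose core depends on time: the core at time t is contracted with one factor row per mode.

```python
import numpy as np

def tucker_entry(core_t, U, idx):
    """Reconstruct one entry of a 3-mode Tucker model with a
    time-varying core (toy sketch, not the repository's API).

    core_t: (r1, r2, r3) core tensor evaluated at time t
    U:      list of factor matrices, U[m] has shape (dim_m, r_m)
    idx:    entry index (i, j, k)
    """
    vec = core_t
    # Contract the core with one factor row per mode; each tensordot
    # removes the leading rank dimension, ending in a scalar.
    for Um, i in zip(U, idx):
        vec = np.tensordot(Um[i], vec, axes=([0], [0]))
    return float(vec)
```

With all-ones factors and core, the entry value is just the number of core elements, which is a quick sanity check on the contraction order.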
A temporal Gaussian process with a stationary kernel is equivalent to a linear time-invariant stochastic differential equation (LTI-SDE), which we can solve sequentially by Kalman filtering & smoothing. We further apply conditional moment matching to update the latent factors. The resulting inference algorithm has cost linear in the number of time steps.
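The linear-in-time cost comes from running a Kalman filter over the SDE's discretized state. The sketch below (my own illustration, not the repository's implementation) does this for the simplest case, a Matern-1/2 (Ornstein-Uhlenbeck) kernel, whose scalar LTI-SDE discretizes exactly between time stamps; `ell`, `sigma2`, and `noise` are assumed hyperparameter names.

```python
import numpy as np

def kalman_filter_ou(ts, ys, ell=1.0, sigma2=1.0, noise=0.1):
    """Kalman filter for a GP with a Matern-1/2 (OU) kernel viewed as an
    LTI-SDE; one predict/update pair per observation, so the cost is
    linear in the number of time steps.

    ts: sorted observation times, ys: observed values.
    Returns arrays of filtered means and variances.
    """
    m, P = 0.0, sigma2            # start from the stationary prior
    means, vars_ = [], []
    t_prev = ts[0]
    for t, y in zip(ts, ys):
        # Predict: exact discretization of the OU SDE over the gap
        A = np.exp(-(t - t_prev) / ell)
        m = A * m
        P = A * P * A + sigma2 * (1.0 - A * A)
        # Update: scalar Kalman correction with Gaussian observation noise
        S = P + noise
        K = P / S
        m = m + K * (y - m)
        P = (1.0 - K) * P
        means.append(m)
        vars_.append(P)
        t_prev = t
    return np.array(means), np.array(vars_)
```

A matching backward (Rauch-Tung-Striebel) smoothing pass, also linear in the number of steps, would give the full posterior marginals; in BCTT the state is the vectorized Tucker core rather than a scalar, but the filtering structure is the same.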
The project is mainly built with PyTorch 1.10.1 under Python 3. Besides that, make sure to install tqdm and tensorly before running the project.
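A possible environment setup, assuming the standard PyPI package names (the repo does not pin tqdm or tensorly versions, so those are left unpinned here):

```shell
# Install the dependencies listed above; torch is pinned to the
# version the project was built with.
pip install torch==1.10.1 tqdm tensorly
```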
Example notebooks: code_fang\notebook
Run scripts (on synthetic & real data): code_fang\script_BCTT.sh
Check our paper for more details.
Please cite our work if you find it helpful:
@inproceedings{fang2022bayesian,
title={Bayesian Continuous-Time Tucker Decomposition},
author={Fang, Shikai and Narayan, Akil and Kirby, Robert and Zhe, Shandian},
booktitle={International Conference on Machine Learning},
pages={6235--6245},
year={2022},
organization={PMLR}
}