wanghao9610 opened 3 years ago
Lol, he already provided us with a neat and clean implementation of the arch! You can check a training example here that uses this repo's implementation of the Transformer.
It is just toy training code, without any of the actual training details, e.g. data preparation, the optimizer, init_lr, and so on. Waiting for your code release!
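For what it's worth, the "training details" a toy script leaves out are essentially the standard supervised loop: prepare data, pick an optimizer and learning rate, then iterate. Below is a minimal stdlib-only sketch of that loop on a hypothetical stand-in model (a scalar linear fit, not TimeSformer) just to show where those pieces slot in; the real training would use the repo's model and a proper optimizer.

```python
import random

# Hypothetical stand-in for data preparation: toy samples of y = 2x + 1.
data = [(float(x), 2.0 * x + 1.0) for x in range(-5, 6)]

# Hypothetical stand-in for the model: y_hat = w * x + b.
w, b = 0.0, 0.0

init_lr = 0.01   # initial learning rate (the detail the toy script omits)
epochs = 200

random.seed(0)   # deterministic shuffling for reproducibility
for epoch in range(epochs):
    random.shuffle(data)          # shuffle each epoch, as a dataloader would
    for x, y in data:
        pred = w * x + b
        err = pred - y            # gradient of squared error: 2 * err
        # Plain SGD update (the "optimizer" step).
        w -= init_lr * 2.0 * err * x
        b -= init_lr * 2.0 * err
```

After training, `w` and `b` should be close to the true parameters 2 and 1. Swapping in a video model, a real dataloader, and a scheduler is where the missing detail lives.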
@wanghao9610 Have you found the relevant code?
You can find the official implementation of the paper in this repo: https://github.com/facebookresearch/TimeSformer
Could you share the training and evaluation code? Thanks a lot.
Have you found the training code yet? I am also in need of the training code.