ajabri / videowalk

Repository for "Space-Time Correspondence as a Contrastive Random Walk" (NeurIPS 2020)
http://ajabri.github.io/videowalk
MIT License
266 stars 38 forks

Test time training code #15

Open AndyTang15 opened 3 years ago

AndyTang15 commented 3 years ago

Hi Allan,

Thanks again for releasing the code! Could you tell us when you plan to release the test-time training code? Or would it be possible for you to give some suggestions on how to implement it based on the current codebase?
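In case it helps the discussion, here is a rough sketch of how I imagine test-time adaptation from the paper could be wired up: fine-tune the encoder on the random-walk loss over clips from the test video before evaluating on it. The model's forward interface (returning the walk loss for a clip) and all names below are assumptions, not the released API.

```python
import torch

def test_time_adapt(model, video_frames, num_steps=25, lr=1e-4, clip_len=8):
    """Fine-tune `model` on short clips from a single test video before evaluation.

    video_frames: tensor of shape (T, C, H, W) for one test video.
    """
    model.train()
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)

    T = video_frames.shape[0]
    for _ in range(num_steps):
        # Sample a random clip_len-frame window from the test video.
        start = torch.randint(0, max(T - clip_len, 1), (1,)).item()
        clip = video_frames[start:start + clip_len].unsqueeze(0)  # (1, clip_len, C, H, W)

        # Assumed interface: a forward pass that returns the contrastive
        # random-walk (cycle-consistency) loss for the clip.
        loss = model(clip)

        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

    model.eval()
    return model
```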

Many thanks!

fatemehazimi990 commented 3 years ago

Hi @ajabri , @AndyTang15 ,

I am also very interested in knowing more about this. I tried a couple of experiments following the instructions in the paper, but the results were never better than without test-time adaptation. So I will add a couple of questions/observations here:

  1. Is the same data augmentation used for test-time adaptation as during training?
  2. When choosing a window from t-10 to t+10, is the whole window used for the walk (walk length 40), or are shorter sequences sampled from it?
  3. In this scenario, was the model able to overfit properly, such that the loss gets close to 0 and the node-classification accuracy approaches ~100%? If yes, this would mean that even when the model masters the self-supervision task, it only improves the downstream task marginally (somewhat surprising).
  4. Lastly, were any measures taken regarding the BN statistics during test-time adaptation (e.g. freezing BN due to the single-sample batch size)? One possibility is sketched below.
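
For point 4, one common option (a general PyTorch pattern, not something taken from this codebase) is to keep the BN layers in eval mode during adaptation so their running statistics are not updated by single-video batches; a minimal sketch:

```python
import torch.nn as nn

def freeze_bn(model):
    """Keep BatchNorm layers in eval mode and exclude them from adaptation."""
    for m in model.modules():
        if isinstance(m, (nn.BatchNorm1d, nn.BatchNorm2d, nn.BatchNorm3d)):
            m.eval()  # do not update running statistics with tiny test batches
            if m.affine:
                m.weight.requires_grad_(False)
                m.bias.requires_grad_(False)

# Usage: call after switching the rest of the model to train mode, e.g.
#   model.train()
#   freeze_bn(model)
```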

Many thanks and looking forward to hearing from you :)