hyf015 / egocentric-gaze-prediction

Code for the paper "Predicting Gaze in Egocentric Video by Learning Task-dependent Attention Transition"

How to run testing? #16

Closed kazucmpt closed 5 years ago

kazucmpt commented 5 years ago

We have finished training, so next we want to test. To run testing, should we run

python3 gaze_full.py --flowPath ../gtea_imgflow_pre --imagePath ../gtea_images --fixsacPath ../fixsac --gtPath ../gtea_gts --pretrained_model save/best_SP.pth.tar --pretrained_lstm save/best_lstm.pth.tar --pretrained_late save/best_late.pth.tar --extract_lstm --extract_late

Is this correct? In this case, will the SP module work?

hyf015 commented 5 years ago

If you have finished training everything, files for testing should have already been extracted. Thus you don't need to extract the intermediate files again.
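Based on this answer, the testing-only invocation would presumably be the command from the question with the `--extract_lstm` and `--extract_late` flags dropped, since the intermediate files already exist after training. A sketch (the paths are the same assumed ones from the question above):

```shell
# Assumed testing-only invocation of gaze_full.py: identical to the
# training command, minus the --extract_* flags, because the
# intermediate files were already written during training.
cmd="python3 gaze_full.py \
  --flowPath ../gtea_imgflow_pre \
  --imagePath ../gtea_images \
  --fixsacPath ../fixsac \
  --gtPath ../gtea_gts \
  --pretrained_model save/best_SP.pth.tar \
  --pretrained_lstm save/best_lstm.pth.tar \
  --pretrained_late save/best_late.pth.tar"

# Print the command so it can be inspected before running.
echo "$cmd"
```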

kazucmpt commented 5 years ago

I want to do testing only. But as I said in the other question, I don't think it works without modifying gaze_full.py, right?

hyf015 commented 5 years ago

I think it's a good idea to modify the scripts for your purpose.

kazucmpt commented 5 years ago

Thank you. But why don't you modify it yourself? It would help many people.

hyf015 commented 5 years ago

Because I'm also using this repo for other purposes, sorry.

kazucmpt commented 5 years ago

I see.