huhengtong / UKD_CVPR2020

The source code for the CVPR2020 paper "Creating Something from Nothing: Unsupervised Knowledge Distillation for Cross-Modal Hashing".

What is the GPU used in your experiment? #4

Open limycml opened 4 years ago

limycml commented 4 years ago

What GPU did you use in your experiments? On a GTX 1070 Ti (8 GB) I get an out-of-memory error (ResourceExhaustedError). I am looking forward to your reply.

huhengtong commented 4 years ago

I used a GTX TITAN X GPU. You can try a smaller batch size.

limycml commented 4 years ago

Did you use a single GPU or multiple GPUs? It seems a single card should not be able to support batch size = 256.

limycml commented 4 years ago

What effect does a smaller batch size have on the result (MAP)?

limycml commented 4 years ago

I'm using a GTX 1070 Ti (8 GB) card. In the training phase, only batch size = 32 runs without an error; 64 or higher raises a ResourceExhaustedError. In the test phase, batch size = 64 works, but 128 raises a ResourceExhaustedError.

huhengtong commented 4 years ago

I used a single GPU. A smaller batch size will slow down training, but I am not sure it lowers accuracy. You need to select a batch size that fits your GPU.
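For readers hitting the same ResourceExhaustedError, one common workaround (not part of this repository) is to halve the batch size until one training step fits in GPU memory. The sketch below uses a hypothetical `run_step` callable standing in for a forward/backward pass; with TensorFlow, the exception to catch would be `tf.errors.ResourceExhaustedError` rather than `MemoryError`.

```python
def find_workable_batch_size(run_step, start=256, minimum=1):
    """Halve the batch size until run_step succeeds, or give up at `minimum`.

    run_step: callable taking a batch size; raises MemoryError on OOM
              (with TensorFlow: tf.errors.ResourceExhaustedError).
    """
    batch_size = start
    while batch_size >= minimum:
        try:
            run_step(batch_size)     # attempt one training step at this size
            return batch_size
        except MemoryError:          # out of GPU memory: try half the size
            batch_size //= 2
    raise RuntimeError("No batch size fits in GPU memory")

# Toy stand-in: pretend the GPU can only fit batches of 64 or smaller,
# roughly matching the 8 GB card discussed above.
def fake_step(batch_size):
    if batch_size > 64:
        raise MemoryError

print(find_workable_batch_size(fake_step))  # prints 64
```

This only finds a size that fits; as noted above, a much smaller batch mainly costs speed, though its effect on MAP would need to be checked empirically.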

huhengtong commented 4 years ago

I have just updated the teacher_model file, since the former version had some problems. Please use the new version if convenient.

limycml commented 4 years ago

Thank you for your patience and for updating the teacher_model file