jackyjsy / CVPR21Chal-SLR

This repo contains the official code of our work SAM-SLR which won the CVPR 2021 Challenge on Large Scale Signer Independent Isolated Sign Language Recognition.

Conv3D for WLASL #14

Closed EvgeniaChroni closed 2 years ago

EvgeniaChroni commented 2 years ago

Hello, thank you so much for your code.

I am trying to train Conv3D on the WLASL dataset, but the accuracy is very low. Do you have any idea why this is happening?

Thank you in advance.

jackyjsy commented 2 years ago

Hi there, thanks for reaching out. Can you share more details of your training setup (e.g., learning rate, pretrained model, epochs, and batch size)? What accuracy did you get?

EvgeniaChroni commented 2 years ago

Yeah, sure. I use Sign_Isolated_Conv3D_clip.py to run the code, with:

- num_classes = 2000
- epochs = 100
- batch_size = 32
- learning_rate = 1e-3

I also use the pretrained model that you provide, but I train only the last layer (model.fc1 = nn.Linear(model.fc1.in_features, num_classes)).
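For readers following along, this "freeze the backbone, retrain only the head" setup can be sketched as below. This is a hypothetical illustration, not the repo's actual code: `model.backbone` is a stand-in for the pretrained Conv3D feature extractor, and only the `fc1` attribute name comes from the comment above.

```python
import torch.nn as nn

# Stand-in for the repo's Conv3D network; only the attribute layout matters here.
# `backbone` is a hypothetical name for the pretrained feature extractor.
model = nn.Module()
model.backbone = nn.Linear(16, 8)   # pretend pretrained feature layers
model.fc1 = nn.Linear(8, 100)       # pretrained classifier head

num_classes = 2000

# Freeze every pretrained weight, then swap in a fresh head; the new layer's
# parameters default to requires_grad=True, so only it receives gradient updates.
for p in model.parameters():
    p.requires_grad = False
model.fc1 = nn.Linear(model.fc1.in_features, num_classes)

trainable = [name for name, p in model.named_parameters() if p.requires_grad]
print(trainable)  # ['fc1.weight', 'fc1.bias']
```

With everything but `fc1` frozen, the optimizer can only reshuffle the final linear map, which is often too little capacity for 2000 fine-grained sign classes and can explain the low accuracy reported here.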

The accuracy is about 10%.

Thank you again

jackyjsy commented 2 years ago

I see. In that case, try training the whole network instead: load the pretrained model, pop the last fc layer, and append a new fc layer with 2000 classes. For reference, I trained the whole network and got 47.51% top-1 per-instance accuracy on RGB frames.
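The suggested fine-tuning recipe can be sketched as follows. This is a minimal, assumed example: `TinyConv3D` is a toy stand-in for the repo's Conv3D model, and the in-memory `state` dict stands in for the pretrained checkpoint you would normally `torch.load`.

```python
import torch
import torch.nn as nn

class TinyConv3D(nn.Module):
    """Toy stand-in for the repo's Conv3D model (hypothetical architecture)."""
    def __init__(self, num_classes):
        super().__init__()
        self.conv = nn.Conv3d(3, 4, kernel_size=3, padding=1)
        self.pool = nn.AdaptiveAvgPool3d(1)
        self.fc1 = nn.Linear(4, num_classes)

    def forward(self, x):
        x = self.pool(self.conv(x)).flatten(1)
        return self.fc1(x)

model = TinyConv3D(num_classes=100)   # pretend this matches the checkpoint
state = model.state_dict()            # stands in for torch.load(<checkpoint>)

# Pop the old fc weights so shapes don't clash, attach a 2000-class head,
# and load the remaining pretrained weights non-strictly.
for key in ("fc1.weight", "fc1.bias"):
    state.pop(key)
model.fc1 = nn.Linear(model.fc1.in_features, 2000)
model.load_state_dict(state, strict=False)

# Every parameter stays trainable, so the whole network is fine-tuned.
assert all(p.requires_grad for p in model.parameters())
out = model(torch.randn(2, 3, 8, 16, 16))  # (batch, channels, frames, H, W)
print(out.shape)  # torch.Size([2, 2000])
```

`strict=False` tolerates the missing `fc1.*` keys after popping them, which is the usual way to reuse a checkpoint whose classifier size differs from the new task.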

EvgeniaChroni commented 2 years ago

Ok thank you for the info.

I also used gen_frames.py for data preprocessing, with the same sample_duration = 32.
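For context, a fixed sample_duration of 32 means each video is reduced to 32 frames before being fed to the 3D CNN. Below is a hypothetical sketch of one common sampling strategy (uniform stride with last-frame padding); the actual logic in gen_frames.py may differ.

```python
def sample_frame_indices(num_frames, sample_duration=32):
    """Pick `sample_duration` frame indices from a video of `num_frames`:
    uniform stride when the video is long enough, otherwise repeat the
    last frame to pad. Illustrative only; not the repo's exact code."""
    if num_frames >= sample_duration:
        stride = num_frames / sample_duration
        return [int(i * stride) for i in range(sample_duration)]
    indices = list(range(num_frames))
    return indices + [num_frames - 1] * (sample_duration - num_frames)

print(len(sample_frame_indices(100)))  # 32
print(sample_frame_indices(5, 8))      # [0, 1, 2, 3, 4, 4, 4, 4]
```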

Are you planning to provide code for WLASL as well?

jackyjsy commented 2 years ago

I did the same thing for data preprocessing as well. Yes, we are preparing a journal extension of this workshop paper. A preprint version with code will be released shortly (targeting this month).

LiangSiyv commented 2 years ago

> I did the same thing for data preprocessing as well. Yes, we are preparing a journal extension of this workshop paper. A preprint version with code will be released shortly (targeting this month).

I've already looked at your preprint version and learned a lot from your current code. I am looking forward to the released code. Thank you for sharing!