GibranBenitez / IPN-hand

Code and models of our arXiv paper "IPN Hand: A Video Dataset and Benchmark for Real-Time Continuous Hand Gesture Recognition"
https://gibranbenitez.github.io/IPN_Hand/
MIT License

Unable to reproduce results of ResNeXt101 #12

Open JhonarraonCSDN opened 1 year ago

JhonarraonCSDN commented 1 year ago

Thanks for your great work. I am trying to reproduce the ResNeXt-101 results on IPN-Hand, but I have been unable to do so and would appreciate some help. I followed the training script `run_clf_ipn_trainRex-js32b32.sh` under the `test/` folder; here are my Python args:

```bash
python main.py --root_path . \
  --video_path datasets/HandGestures/IPN_dataset \
  --annotation_path annotation_ipnGesture/ipnall_but_None.json \
  --result_path results_ipn \
  --pretrain_path report_ipn/ResNeXt101/shared_models_v1/models/jester_resnext_101_RGB_32.pth \
  --pretrain_dataset jester --dataset ipn \
  --sample_duration 32 --learning_rate 0.01 \
  --model resnext --model_depth 101 --resnet_shortcut B \
  --batch_size 384 --n_classes 13 --n_finetune_classes 13 \
  --n_threads 16 --checkpoint 1 --modality RGB \
  --train_crop random --n_val_samples 1 --test_subset test \
  --n_epochs 100 --store_name ipnClf_jes32r_b32
```

I trained for 100 epochs and my best validation accuracy is 65%, while the paper reports 83%. Using the released pre-trained ResNeXt-101 weights, I was able to reach around 83% validation accuracy, so the evaluation pipeline appears correct. However, I am unsure whether I made a mistake in my settings or whether there are specific training strategies that could improve the accuracy further. opts_ipnClf_jes32r_b32_resnext-101.txt
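One failure mode I tried to rule out: if the Jester checkpoint was saved under `nn.DataParallel`, its keys carry a `module.` prefix, and a non-strict load can silently skip the whole backbone so training starts from random weights. Below is an illustrative sketch of that key-matching check only (plain dicts stand in for state_dicts; the key names are hypothetical, not the repo's actual loading code):

```python
def report_mismatch(model_keys, ckpt_keys):
    """Return (missing, unexpected) keys after normalizing a DataParallel prefix."""
    # Strip a possible "module." prefix that nn.DataParallel adds on save.
    normalized = {k[len("module."):] if k.startswith("module.") else k
                  for k in ckpt_keys}
    missing = sorted(set(model_keys) - normalized)      # stay randomly initialized
    unexpected = sorted(normalized - set(model_keys))   # ignored by the model
    return missing, unexpected

# Hypothetical key names for demonstration.
model_keys = ["conv1.weight", "layer1.0.conv1.weight", "fc.weight", "fc.bias"]
ckpt_keys = ["module.conv1.weight", "module.layer1.0.conv1.weight",
             "module.fc.weight"]  # e.g. classifier head saved with a different shape

missing, unexpected = report_mismatch(model_keys, ckpt_keys)
print("missing:", missing)        # → missing: ['fc.bias']
print("unexpected:", unexpected)  # → unexpected: []
```

If `missing` covers the entire backbone rather than just the classifier head, the pretrained weights were never actually applied, which would explain a large accuracy gap.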