goutamyg opened this issue 6 months ago
Also, can you please suggest what /path/to/model_best.pth.tar should be in
CUDA_VISIBLE_DEVICES=0,1,2,3 python -m torch.distributed.launch --nproc_per_node=4 --master_port=1234 tools/train.py --config config/NvGesture.yml --nprocs 1 --eval_only --resume /path/to/model_best.pth.tar
to evaluate the model on the NvGesture dataset?
Hi,
Thank you for your attention to our work and for pointing this out. We will fix this bug as soon as possible. In the meantime, you can try using this code.
@zhoubenjia Thank you for your kind response. Since I am interested in the NvGestures dataset, could you please release the corresponding models from your PAMI paper? I appreciate your help.
Access the following link to get the NvGestures parameters: https://drive.google.com/drive/folders/1YMkVDX44cXVzlBQdYHHcDr0Cr7gd7HsY
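Once the parameters are downloaded, the --resume argument in the command from the first comment should point at the downloaded checkpoint file. For example (the local path below is hypothetical; use wherever you actually saved the Drive files):

```shell
# Hypothetical checkpoint location after downloading from the Drive folder above.
CUDA_VISIBLE_DEVICES=0,1,2,3 python -m torch.distributed.launch --nproc_per_node=4 --master_port=1234 \
    tools/train.py --config config/NvGesture.yml --nprocs 1 --eval_only \
    --resume ./checkpoints/NvGesture/model_best.pth.tar
```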
Thank you for sharing the model files.
Hi @zhoubenjia, thank you for sharing your code. However, I am getting an error during model inference on the NVGestures test dataset. The error occurs here, because none of the contours have an area greater than 500. Can you please suggest how to fix this error?
I was also curious whether you use different pre-processing methods for the depth maps from the NVGestures and NTU-RGBD datasets. If so, can you please share the pre-processing code for the NVGestures dataset? Thank you.
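Regarding the contour-area error above: one possible workaround (a sketch of the selection logic only, not the authors' fix; the threshold of 500 and the fallback behavior are assumptions) is to fall back to the single largest contour when none passes the area threshold, instead of ending up with an empty set:

```python
def select_contours(contours, area_fn, min_area=500):
    """Keep contours whose area is at least min_area.

    If no contour qualifies (the situation that crashes inference here),
    fall back to the single largest contour rather than returning nothing.
    """
    if not contours:
        return []
    kept = [c for c in contours if area_fn(c) >= min_area]
    return kept if kept else [max(contours, key=area_fn)]

# With OpenCV, contours would come from cv2.findContours and area_fn would
# be cv2.contourArea; plain (width, height) boxes keep the sketch
# dependency-free for illustration.
boxes = [(10, 12), (5, 6)]                 # areas 120 and 30, both below 500
print(select_contours(boxes, lambda b: b[0] * b[1]))  # falls back to (10, 12)
```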