bmartacho / OmniPose

This is the official PyTorch implementation for “OmniPose: A Multi-Scale Framework for Multi-Person Pose Estimation”.

Could you provide more info on how to run the inference mode? #6

Open filippuo2000 opened 1 year ago

filippuo2000 commented 1 year ago

I was trying to run inference on sample images (not from the COCO or MPII datasets). I placed the pretrained model at 'weights/coco/OmniPose_w48_v2_128/model_best.pth' and uncommented the corresponding cfg file in run_demo.sh. However, I'm not sure what exactly should be provided for the other arguments when running inference.py. I tried passing the path to the pretrained model as MODELDIR, but that does not seem to work. I also tried setting MODEL_FILE in omnipose_w48_128x96.yaml to either the pretrained model path or the omnipose.py model, but that only led to dictionary-related issues. I always ended up with this error:

AttributeError: 'collections.OrderedDict' object has no attribute 'state_dict' (in inference.py, line )
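From what I can tell, this error usually means the .pth file is already a bare state_dict (an OrderedDict of tensors), while the loading code expects a full model object and calls .state_dict() on it. Below is a minimal, generic PyTorch sketch (not the repo's actual inference.py code) of how I would expect such a checkpoint to be loaded; the placeholder module stands in for however the OmniPose network is actually built from the cfg:

```python
import torch

# Placeholder for the real OmniPose network built from the cfg
# (e.g. via the model definition in omnipose.py); the actual builder differs.
model = torch.nn.Module()

ckpt = torch.load('weights/coco/OmniPose_w48_v2_128/model_best.pth',
                  map_location='cpu')

# If the checkpoint is a bare OrderedDict of tensors, pass it straight to
# load_state_dict(); calling .state_dict() on it raises the error above.
if isinstance(ckpt, dict) and 'state_dict' in ckpt:
    state_dict = ckpt['state_dict']
else:
    state_dict = ckpt

model.load_state_dict(state_dict, strict=False)
model.eval()
```

If inference.py instead does something like torch.load(...) followed by .state_dict() on the result, that would explain the AttributeError when the file contains only the weights; but I may be misreading how the script is meant to be used.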

Would really appreciate it if anyone could provide more details on how to run the network in inference mode, as it's not that obvious to me, at least.