I was trying to run inference on my own sample images (not from the COCO or MPII datasets). I placed the pretrained model at 'weights/coco/OmniPose_w48_v2_128/model_best.pth' and uncommented the appropriate cfg file in run_demo.sh. However, I'm not sure whether, and with what values, the other arguments should be provided when running inference.py. I tried passing the path to the pretrained model as MODELDIR, but that does not seem to work. I also tried setting MODEL_FILE in omnipose_w48_128x96.yaml to either the pretrained model path or the omnipose.py model, but only got dictionary-related issues. I always end up with this error:
'AttributeError: 'collections.OrderedDict' object has no attribute 'state_dict' in inference.py, line
I would really appreciate it if anyone could provide more details on how to run the network in inference mode, as it's not obvious to me, at least.
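For what it's worth, that AttributeError typically means torch.load returned a bare state_dict (an OrderedDict of tensors) rather than a pickled nn.Module, and the script then calls .state_dict() on it. A minimal sketch of a workaround, assuming you can patch the loading code in inference.py; extract_state_dict is a hypothetical helper name, not part of the repo:

```python
from collections import OrderedDict

def extract_state_dict(checkpoint):
    """Return a plain state_dict regardless of how the checkpoint was saved."""
    if isinstance(checkpoint, OrderedDict):
        # Bare state_dict was saved: use it directly.
        return checkpoint
    if isinstance(checkpoint, dict) and "state_dict" in checkpoint:
        # Full checkpoint dict (state_dict plus optimizer/epoch metadata).
        return checkpoint["state_dict"]
    # Otherwise assume an entire nn.Module object was pickled.
    return checkpoint.state_dict()

# Hypothetical usage, with the path from above:
# checkpoint = torch.load("weights/coco/OmniPose_w48_v2_128/model_best.pth",
#                         map_location="cpu")
# model.load_state_dict(extract_state_dict(checkpoint), strict=False)
```

This just normalizes the three common ways PyTorch checkpoints get saved, so the load no longer depends on which format model_best.pth happens to be.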