YuliangXiu / MobilePose

Light-weight Single Person Pose Estimator
http://xiuyuliang.cn

Error in model inference due to dsntnn package #22

Closed MetaDev closed 5 years ago

MetaDev commented 5 years ago

I get an error from the dsntnn library when I try to run inference with a model:

modelname = "mobilenetv2"
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

net = CoordRegressionNetwork(n_locations=16, backbone=modelname).to(device)
net(torch.ones([1,3, 224,224], dtype=torch.float, device=device))

File "/Users/Harald/Github/MobilePose-pytorch/inference_test.py", line 16, in net(torch.ones([1,3, 224,224], dtype=torch.float, device=device)) File "/Users/Harald/anaconda3/lib/python3.6/site-packages/torch/nn/modules/module.py", line 477, in call result = self.forward(*input, **kwargs) File "/Users/Harald/Github/MobilePose-pytorch/network.py", line 55, in forward coords = dsntnn.dsnt(heatmaps) File "/Users/Harald/anaconda3/lib/python3.6/site-packages/dsntnn/init.py", line 79, in dsnt return soft_argmax(heatmaps) File "/Users/Harald/anaconda3/lib/python3.6/site-packages/dsntnn/init.py", line 67, in soft_argmax return linear_expectation(heatmaps, values).flip(-1) RuntimeError: expected flip dims axis >= 0, but got min flip dims=-1

MetaDev commented 5 years ago

I found that updating all my packages resolved this error.
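If anyone else hits this: after upgrading (e.g. `pip install --upgrade torch`, mentioned here only as the usual route, not something the thread specifies), a quick sanity check that the negative-axis flip now works, and hence that `dsntnn.dsnt` should get past that point, is:

```python
import torch

print(torch.__version__)

# soft_argmax in dsntnn calls .flip(-1); if this runs without a RuntimeError,
# the inference snippet above should no longer fail there
# (the heatmap-like shape below is just illustrative)
torch.ones(1, 16, 56, 56).flip(-1)
```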