ajaichemmanam / simple_bodypix_python

A simple and minimal bodypix inference in python

Unable to reshape segment, due to the third dimension. #4

Closed. terry30207 closed this issue 4 years ago.

terry30207 commented 4 years ago

Error message:

Traceback (most recent call last):
  File "C:\Users\user\source\repos\PythonApplication1\PythonApplication1\evalbody_singleposemodel.py", line 155, in
    segmentationMask, (segmentationMask.shape[0], segmentationMask.shape[1]))
  File "<__array_function__ internals>", line 6, in reshape
  File "C:\Program Files\Python37\lib\site-packages\numpy\core\fromnumeric.py", line 301, in reshape
    return _wrapfunc(a, 'reshape', newshape, order=order)
  File "C:\Program Files\Python37\lib\site-packages\numpy\core\fromnumeric.py", line 58, in _wrapfunc
    return _wrapit(obj, method, *args, **kwds)
  File "C:\Program Files\Python37\lib\site-packages\numpy\core\fromnumeric.py", line 47, in _wrapit
    result = getattr(asarray(obj), method)(*args, **kwds)
ValueError: cannot reshape array of size 128 into shape (2,2)

The problem is that, with bodypix/mobilenet/float/075/model-stride16.json, the model returns a segments tensor with four dimensions. After "segments = np.squeeze(results[6], 0)", there are still three dimensions, and no matter what the input image is, the length of the third dimension is always 32. I've tried with a 16x16 image, and the output tensors look like:

displacement_bwd (1, 2, 2, 34)
displacement_fwd (1, 2, 2, 1)
heatmaps (2, 2, 24)
longoffsets (2, 2, 34)
offsets (2, 2, 17)
partHeatmaps (2, 2, 32)
segments (2, 2, 32)
partOffsets (2, 2, 48)
maskshape (2, 2, 32)

I still have no idea what the third dimension means, but it produces a ValueError.
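For reference, a minimal reproduction of the reshape error (placeholder data, not the real model output): squeezing a (1, 2, 2, 32) segments tensor leaves a (2, 2, 32) array, and 2 * 2 * 32 = 128 elements cannot be reshaped into a (2, 2) mask.

```python
import numpy as np

# Placeholder tensor with the same shape the model returns for a 16x16 input.
segments = np.zeros((1, 2, 2, 32), dtype=np.float32)

segmentationMask = np.squeeze(segments, 0)  # shape (2, 2, 32): the channel axis remains
try:
    segmentationMask.reshape(segmentationMask.shape[0], segmentationMask.shape[1])
except ValueError as err:
    print(err)  # cannot reshape array of size 128 into shape (2,2)
```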

By the way, with ResNet I got "ValueError: buffer is smaller than requested size".

ajaichemmanam commented 4 years ago

The image you are giving is too small. Try with a larger image.

For a 480x640 image, the following shapes are obtained from a ResNet50 stride-16 model.

displacement_bwd (1, 31, 41, 32)
displacement_fwd (1, 31, 41, 32)
heatmaps (31, 41, 17)
longoffsets (31, 41, 34)
offsets (31, 41, 34)
partHeatmaps (31, 41, 24)
segments (31, 41, 1)
partOffsets (31, 41, 48)
maskshape (31, 41, 1)

A 2x2 output resolution is not big enough. Please check and let me know if the issue persists. Also let me know the TensorFlow version, and whether the problem occurs with any specific model.
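For context, the output grid size follows directly from the stride. A small sketch of the usual BodyPix sizing convention (assumed here, not taken verbatim from this repo): the input is resized to a valid resolution of the form stride * n + 1, so each output tensor has a spatial grid of n + 1 cells per dimension.

```python
import math

# Assumed BodyPix/PoseNet sizing convention: output grid = ceil(dim / stride) + 1.
def output_grid(dim, stride=16):
    return math.ceil(dim / stride) + 1

print(output_grid(480, 16), output_grid(640, 16))  # 31 41 -> matches the shapes above
print(output_grid(16, 16), output_grid(16, 16))    # 2 2   -> only a 2x2 grid for a 16x16 input
```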

terry30207 commented 4 years ago

Thanks for the reply. I've tried with a 1920x1440 and a 1280x720 JPEG image, but I still get

ValueError: cannot reshape array of size 352352 into shape (91,121)

and

ValueError: cannot reshape array of size 119232 into shape (46,81)

I have tensorflow 2.1.0 with bodypix/mobilenet/float/075/model-stride16.json. And I'm still trying to figure out why the resnet50 stride16 model raises "ValueError: buffer is smaller than requested size".
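For what it's worth, the element counts in those errors point at a tensor that still has 32 channels being treated as the single-channel segmentation output (a sketch of the arithmetic, not the repo's code):

```python
# 1920x1440 at stride 16 gives a 91x121 output grid; 1280x720 gives 46x81.
# Both failing sizes are exactly 32 times the grid size, so the array being
# reshaped still has 32 channels instead of 1.
print(352352 // (91 * 121))  # 32
print(119232 // (46 * 81))   # 32
```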

ajaichemmanam commented 4 years ago

I was able to reproduce the first issue with the MobileNet model; it has been fixed in the last commit. The PoseNet output tensors were ordered differently for the ResNet and MobileNet models. Please check and report back.
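A hypothetical illustration of one way to avoid that kind of mismatch: key the model outputs by name instead of by a fixed positional index (the names and shapes below are placeholders, not the repo's real values).

```python
import numpy as np

# Placeholder output names and tensors standing in for a model's results.
output_names = ["segments", "part_heatmaps"]
results = [np.zeros((1, 31, 41, 1)), np.zeros((1, 31, 41, 24))]

# Looking up by name means a different ordering cannot silently swap tensors.
outputs = dict(zip(output_names, results))
segments = np.squeeze(outputs["segments"], 0)  # always the single-channel mask
print(segments.shape)  # (31, 41, 1)
```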

ajaichemmanam commented 4 years ago

I still have no idea about the buffer error. It works fine on my setup. Please help me reproduce the issue.

terry30207 commented 4 years ago

Thanks for the fix, it's working fine now. I couldn't find the cause of the buffer error, but I deleted my environment, set up a new conda environment, and the issue just disappeared. Now I can use both MobileNet and ResNet50 without any problem. I upgraded to TensorFlow 2.2.0 while setting up the new environment; perhaps that fixed the issue?

Also, a reminder for those who want to use MobileNet: since MobileNet requires different preprocessing than ResNet50, make sure to replace

m = np.array([-123.15, -115.90, -103.06])
x = np.add(x, m)

with

x = (x / 127.5) - 1
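A short sketch of the two preprocessing paths (function and variable names are illustrative, not the repo's exact code):

```python
import numpy as np

def preprocess(image, architecture="mobilenet"):
    # image: HxWx3 uint8 RGB array
    x = image.astype(np.float32)
    if architecture == "mobilenet":
        # MobileNet expects inputs scaled to [-1, 1]
        x = (x / 127.5) - 1.0
    else:
        # ResNet50 expects channel-mean subtraction instead
        m = np.array([-123.15, -115.90, -103.06], dtype=np.float32)
        x = np.add(x, m)
    return x
```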

ajaichemmanam commented 4 years ago

Thanks for reminding me about the preprocessing.