de-code / python-tf-bodypix

A Python implementation of the bodypix model.
MIT License

Can not download the model #150

Closed MrRiahi closed 2 years ago

MrRiahi commented 2 years ago

Thanks for this great repository. I am facing a weird problem that I could not solve. When I run the Python API code and it tries to download the model, an error is raised. By default, BodyPixModelPaths.MOBILENET_FLOAT_50_STRIDE_16 is downloaded. After downloading the model architecture, the "download_model" function cannot open it as a JSON file. The error is:

UnicodeDecodeError: 'utf-8' codec can't decode byte 0x8b in position 1: invalid start byte

After some checking, I found that the error comes from the following code in the "download_model" function:

with open(local_model_json_path, 'r', encoding='utf-8') as model_json_fp:
    model_json = json.load(model_json_fp)

I think the model.json file that gets downloaded is not correct. Can you help me figure out how to fix it? Thanks a lot for your help.
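A quick way to see what was actually written to disk is to inspect the first bytes of the downloaded file: a UnicodeDecodeError on byte 0x8b at position 1 is consistent with the gzip magic number 0x1f 0x8b, i.e. a gzip-compressed body saved as if it were plain JSON. A minimal diagnostic sketch, where the path below is only a placeholder for the file that json.load failed on:

# Diagnostic sketch: check whether the downloaded "JSON" file is actually
# gzip-compressed, and parse it either way. The path is a placeholder.
import gzip
import json

local_model_json_path = 'model-stride16.json'  # placeholder: the file json.load failed on

with open(local_model_json_path, 'rb') as raw_fp:
    magic = raw_fp.read(2)

if magic == b'\x1f\x8b':
    # gzip magic number: decompress while reading
    with gzip.open(local_model_json_path, 'rt', encoding='utf-8') as model_json_fp:
        model_json = json.load(model_json_fp)
else:
    with open(local_model_json_path, 'r', encoding='utf-8') as model_json_fp:
        model_json = json.load(model_json_fp)

print(sorted(model_json.keys()))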

MrRiahi commented 2 years ago

When I download the .json and .bin files manually and put them in the correct directory, the rest of the code works well.

de-code commented 2 years ago

Hi @MrRiahi, thank you for raising the issue. The BodyPixModelPaths.MOBILENET_FLOAT_50_STRIDE_16 URL will resolve to https://storage.googleapis.com/tfjs-models/savedmodel/bodypix/mobilenet/float/050/model-stride16.json. That URL itself seems to work, at least at the moment.

In fact, the automated tests also download and run the default model, and they are currently hitting the same issue: https://github.com/de-code/python-tf-bodypix/runs/4292852689?check_suite_focus=true

It appears something has changed and the server is now responding with a gzip-compressed file, regardless of whether the client actually supports that.

The same can be reproduced using curl:

curl https://storage.googleapis.com/tfjs-models/savedmodel/bodypix/mobilenet/float/075/model-stride16.json
# results in binary output
curl --output /dev/stdout https://storage.googleapis.com/tfjs-models/savedmodel/bodypix/mobilenet/float/075/model-stride16.json | zcat
# decodes it to json

In the browser it works because the client accepts and handles gzip.
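One client-side way to work around this, sketched below rather than taken from the library's actual fix, is to download the raw bytes and decompress them whenever they start with the gzip magic number:

# Workaround sketch (not the library's actual fix): fetch the model JSON and
# decompress it if the server returned a gzip-compressed body.
import gzip
import json
import urllib.request

MODEL_JSON_URL = (
    'https://storage.googleapis.com/tfjs-models/savedmodel/bodypix/'
    'mobilenet/float/050/model-stride16.json'
)

with urllib.request.urlopen(MODEL_JSON_URL) as response:
    data = response.read()

if data[:2] == b'\x1f\x8b':
    # gzip magic number: the body is compressed even though we did not ask for it
    data = gzip.decompress(data)

model_json = json.loads(data.decode('utf-8'))
print(sorted(model_json.keys()))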

de-code commented 2 years ago

This should be fixed in v0.3.8. Existing downloads may still cause an issue; in that case you could just delete the files causing the error (the exception now mentions the local file path).
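If you prefer to clear a stale download programmatically, a minimal sketch (the path below is only a placeholder; use the local file path reported by the exception):

# Sketch: remove a previously downloaded, corrupted model file so that it is
# re-downloaded on the next run. The path is a placeholder.
import os

stale_model_json_path = '/path/reported/by/the/exception/model-stride16.json'
if os.path.exists(stale_model_json_path):
    os.remove(stale_model_json_path)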

Could you please confirm and close if you think it is resolved?

MrRiahi commented 2 years ago

Thanks for your reply. I have another question: I want to run this code on a Raspberry Pi 4. As a first step, I need to convert the model to a TFLite model. I followed https://www.tensorflow.org/lite/convert/index to convert the model, but it raises the following error:

OSError: SavedModel file does not exist at: ./models/mobilenet-float16-stride16/{saved_model.pbtxt|saved_model.pb}

I also used the following command for the model conversion:

python -m tf_bodypix \
    convert-to-tflite \
    --model-path \
    "https://storage.googleapis.com/tfjs-models/savedmodel/bodypix/mobilenet/float/075/model-stride16.json" \
    --optimize \
    --quantization-type=float16 \
    --output-model-file "./mobilenet-float16-stride16.tflite"

This works and the model is converted to TFLite, but when I try to use it with TensorFlow Lite following https://www.tensorflow.org/lite/guide/inference, I find that the model's input shape is array([1, 1, 1, 3], dtype=int32). Can you help me solve this problem?
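For reference, a dynamically shaped TFLite input (such as the [1, 1, 1, 3] placeholder shape reported above) can usually be resized at load time through the standard TensorFlow Lite interpreter API. A minimal sketch, assuming the converted model file from the command above and a placeholder 640x480 frame:

# Sketch: resize a dynamically shaped TFLite input before running inference.
# Frame size and dummy input are placeholders; on a Raspberry Pi the
# tflite_runtime.interpreter.Interpreter class can be used the same way.
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path='./mobilenet-float16-stride16.tflite')
input_details = interpreter.get_input_details()
print(input_details[0]['shape'])  # reported as [1, 1, 1, 3] before resizing

height, width = 480, 640  # placeholder frame size
interpreter.resize_tensor_input(input_details[0]['index'], [1, height, width, 3])
interpreter.allocate_tensors()

frame = np.zeros((1, height, width, 3), dtype=input_details[0]['dtype'])  # placeholder input
interpreter.set_tensor(input_details[0]['index'], frame)
interpreter.invoke()

for output in interpreter.get_output_details():
    print(output['name'], interpreter.get_tensor(output['index']).shape)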

de-code commented 2 years ago

As this seems unrelated to the original issue, I would suggest creating a separate issue with your question. Then we can close this one.