dinglufe / segment-anything-cpp-wrapper

MIT License

export_pre_model.py is not working anymore #44

Closed DennisSenftleben closed 6 months ago

DennisSenftleben commented 6 months ago

I have downloaded the SAM-HQ models and executed this exporter script. The error below occurs because the exporter imports the original segment_anything package. If I change the exporter's import to segment_anything_hq, the script runs without error but does not export any models.

Could you please upload the exported models to github?

s:\Sources\Master\segment-anything-cpp-wrapper>python export_pre_model.py
Traceback (most recent call last):
  File "s:\Sources\Master\segment-anything-cpp-wrapper\export_pre_model.py", line 56, in <module>
    sam = SAM.sam_model_registry[model_type](checkpoint=checkpoint)
  File "C:\Users\znoopy2k\AppData\Local\Programs\Python\Python311\Lib\site-packages\segment_anything\build_sam.py", line 15, in build_sam_vit_h
    return _build_sam(
  File "C:\Users\znoopy2k\AppData\Local\Programs\Python\Python311\Lib\site-packages\segment_anything\build_sam.py", line 106, in _build_sam
    sam.load_state_dict(state_dict)
  File "C:\Users\znoopy2k\AppData\Local\Programs\Python\Python311\Lib\site-packages\torch\nn\modules\module.py", line 2041, in load_state_dict
    raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
RuntimeError: Error(s) in loading state_dict for Sam:
    Unexpected key(s) in state_dict: "mask_decoder.hf_token.weight", "mask_decoder.hf_mlp.layers.0.weight", "mask_decoder.hf_mlp.layers.0.bias", "mask_decoder.hf_mlp.layers.1.weight", "mask_decoder.hf_mlp.layers.1.bias", "mask_decoder.hf_mlp.layers.2.weight", "mask_decoder.hf_mlp.layers.2.bias", "mask_decoder.compress_vit_feat.0.weight", "mask_decoder.compress_vit_feat.0.bias", "mask_decoder.compress_vit_feat.1.weight", "mask_decoder.compress_vit_feat.1.bias", "mask_decoder.compress_vit_feat.3.weight", "mask_decoder.compress_vit_feat.3.bias", "mask_decoder.embedding_encoder.0.weight", "mask_decoder.embedding_encoder.0.bias", "mask_decoder.embedding_encoder.1.weight", "mask_decoder.embedding_encoder.1.bias", "mask_decoder.embedding_encoder.3.weight", "mask_decoder.embedding_encoder.3.bias", "mask_decoder.embedding_maskfeature.0.weight", "mask_decoder.embedding_maskfeature.0.bias", "mask_decoder.embedding_maskfeature.1.weight", "mask_decoder.embedding_maskfeature.1.bias", "mask_decoder.embedding_maskfeature.3.weight", "mask_decoder.embedding_maskfeature.3.bias".
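For context, the import change described above amounts to roughly the following (a minimal sketch, not the full exporter; the model_type and checkpoint values are illustrative, and segment_anything_hq is the module name provided by the SAM-HQ package):

    # Default import used by export_pre_model.py; its build_sam() rejects the extra
    # mask_decoder.* keys (hf_token, hf_mlp, compress_vit_feat, ...) found in HQ checkpoints.
    import segment_anything as SAM
    # Swap in the SAM-HQ implementation instead (requires SAM-HQ to be installed):
    # import segment_anything_hq as SAM

    model_type = "vit_h"             # illustrative; must match the downloaded checkpoint
    checkpoint = "sam_hq_vit_h.pth"  # illustrative path to the SAM-HQ checkpoint
    sam = SAM.sam_model_registry[model_type](checkpoint=checkpoint)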

dinglufe commented 6 months ago

The preprocessing model for SAM-HQ is already included in the release 7z file (models/sam_hq_preprocess.onnx, for an image size of 1024x720). It can be used directly, for example:

./sam_cpp_test -pre_model="models/sam_hq_preprocess.onnx" -sam_model="models/sam_hq_vit_h.onnx"

dinglufe commented 6 months ago

Perhaps installing SAM-HQ by its standard method inside a virtual environment and then running export_pre_model.py could work.
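Something like this (a sketch of those steps on Windows; the PyPI package name segment-anything-hq is an assumption taken from the SAM-HQ project's install instructions):

    python -m venv venv
    venv\Scripts\activate
    pip install segment-anything-hq
    python export_pre_model.py

Installing into a clean virtual environment keeps the segment_anything_hq import in the exporter from clashing with an existing segment_anything installation.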

DennisSenftleben commented 6 months ago

The standard installation method worked. Thank you!