busyyang opened this issue 6 months ago
Sorry, I will add relevant scripts when I have time. For now, you can refer to https://github.com/dinglufe/segment-anything-cpp-wrapper.
Have you modified the official ONNX export code? When I export to TensorRT, it reports an error. Can you tell me your ONNX version?
```
[04/17/2024-14:59:10] [E] [TRT] ModelImporter.cpp:768: While parsing node number 1209 [Slice -> "/Slice_2_output_0"]:
[04/17/2024-14:59:10] [E] [TRT] ModelImporter.cpp:769: --- Begin node ---
[04/17/2024-14:59:10] [E] [TRT] ModelImporter.cpp:770: input: "/Resize_output_0" input: "/Constant_70_output_0" input: "/Unsqueeze_19_output_0" input: "/Constant_72_output_0" input: "/Constant_73_output_0" output: "/Slice_2_output_0" name: "/Slice_2" op_type: "Slice"
[04/17/2024-14:59:10] [E] [TRT] ModelImporter.cpp:771: --- End node ---
[04/17/2024-14:59:10] [E] [TRT] ModelImporter.cpp:774: ERROR: ModelImporter.cpp:195 In function parseGraph:
[6] Invalid Node - /Slice_2
[graphShapeAnalyzer.cpp::demandAndResolveInputTensor::1247] Error Code 4: Internal Error (orig_im_size: network input that is shape tensor must have type Int32)
```
My TensorRT version is 8.6.0. Can you tell me your torch and onnx versions?
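For what it's worth, the `orig_im_size` error in the log above usually means the ONNX model was exported with `orig_im_size` as a float tensor; TensorRT requires a network input used as a shape tensor to be int32. A minimal, untested sketch of the change, assuming the dummy inputs in the official `export_onnx_model.py`:

```python
import torch

# The official export script creates this dummy input as a float tensor:
#   "orig_im_size": torch.tensor([1500, 2250], dtype=torch.float)
# TensorRT treats orig_im_size as a shape tensor and demands int32, so
# exporting with an int32 dummy input avoids the Error Code 4 above.
dummy_orig_im_size = torch.tensor([1500, 2250], dtype=torch.int32)
```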
I converted the ONNX models to TensorRT engines successfully with this command:

```
trtexec.exe --onnx=path\mobile_sam_encoder.onnx --saveEngine=path\mobile_sam_encoder.trt
```
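If `trtexec` is not available, the same conversion can be done from Python. A minimal sketch using the TensorRT 8.x Python API (file paths here are placeholders, not from this thread):

```python
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

def build_engine(onnx_path: str, engine_path: str) -> None:
    """Parse an ONNX file and serialize a TensorRT engine to disk."""
    builder = trt.Builder(TRT_LOGGER)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
    )
    parser = trt.OnnxParser(network, TRT_LOGGER)

    with open(onnx_path, "rb") as f:
        if not parser.parse(f.read()):
            for i in range(parser.num_errors):
                print(parser.get_error(i))
            raise RuntimeError("Failed to parse ONNX model")

    config = builder.create_builder_config()
    # 1 GiB workspace; adjust to your GPU.
    config.set_memory_pool_limit(trt.MemoryPoolType.WORKSPACE, 1 << 30)

    serialized = builder.build_serialized_network(network, config)
    with open(engine_path, "wb") as f:
        f.write(serialized)

build_engine("mobile_sam_encoder.onnx", "mobile_sam_encoder.trt")
```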
@m-wei you can give these weight files a try.
And @zhudongwork, could you add the code for extracting the encoder/decoder ONNX weights from mobile_sam.pt?
My TensorRT version is 8.6.1, on Windows 11.
@busyyang these weight files work. My problem was that the ONNX model I exported came from the official MobileSAM library.
@busyyang my problem is solved. You can use nanosam's nanosam/mobile_sam/utils/onnx.py for the export; that code does not export the mask node, and I guess that node has a bug.
Glad to hear that. I also tried export_onnx_model.py and converted to ONNX successfully:

```
MobileSAM/scripts/export_onnx_model.py --checkpoint ../weights/mobile_sam.pt --output ../weights/mobile_sam.onnx --model-type vit_t
```

But it does not produce separate encoder and decoder weights.
@busyyang no, that export script only produces the decoder model. The encoder model comes from https://github.com/dinglufe/segment-anything-cpp-wrapper; the model that library exports as xx_preprocess.onnx is actually the encoder.
In MobileSAM there is only one weight file, mobile_sam.pt. How can I get the encoder and decoder weights separately? Could you share the script for that?
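For anyone stuck on the same question, here is a rough, untested sketch of splitting mobile_sam.pt into separate encoder and decoder ONNX files. It assumes the official MobileSAM repo layout (`mobile_sam.sam_model_registry` and `mobile_sam/utils/onnx.py` with `SamOnnxModel`, mirroring segment-anything); names and shapes may need adjusting:

```python
# Sketch: split mobile_sam.pt into encoder and decoder ONNX files.
import torch
from mobile_sam import sam_model_registry
from mobile_sam.utils.onnx import SamOnnxModel

sam = sam_model_registry["vit_t"](checkpoint="weights/mobile_sam.pt")
sam.eval()

# --- Encoder: preprocessed 1x3x1024x1024 image -> image embeddings.
# Unlike the xx_preprocess.onnx from segment-anything-cpp-wrapper, this
# does NOT bake resizing/normalization into the graph.
torch.onnx.export(
    sam.image_encoder,
    torch.randn(1, 3, 1024, 1024),
    "mobile_sam_encoder.onnx",
    input_names=["image"],
    output_names=["image_embeddings"],
    opset_version=17,
)

# --- Decoder: wrap the prompt encoder + mask decoder, as the official
# export_onnx_model.py does.
decoder = SamOnnxModel(sam, return_single_mask=True)
embed_dim = sam.prompt_encoder.embed_dim              # 256
embed_size = sam.prompt_encoder.image_embedding_size  # (64, 64)
mask_size = [4 * s for s in embed_size]               # (256, 256)
dummy_inputs = {
    "image_embeddings": torch.randn(1, embed_dim, *embed_size),
    "point_coords": torch.randint(0, 1024, (1, 5, 2), dtype=torch.float),
    "point_labels": torch.randint(0, 4, (1, 5), dtype=torch.float),
    "mask_input": torch.randn(1, 1, *mask_size),
    "has_mask_input": torch.tensor([1], dtype=torch.float),
    # int32 avoids the TensorRT "shape tensor must have type Int32"
    # error discussed above; the official script uses float here.
    "orig_im_size": torch.tensor([1080, 1920], dtype=torch.int32),
}
torch.onnx.export(
    decoder,
    tuple(dummy_inputs.values()),
    "mobile_sam_decoder.onnx",
    input_names=list(dummy_inputs.keys()),
    output_names=["masks", "iou_predictions", "low_res_masks"],
    dynamic_axes={
        "point_coords": {1: "num_points"},
        "point_labels": {1: "num_points"},
    },
    opset_version=17,
)
```

The two resulting files can then be converted to TensorRT engines separately, e.g. with the trtexec command shown earlier in this thread.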