brandonjraven opened 2 years ago
@karimnosseir, could you take a look at this issue? Thanks!
Hi @karimnosseir and @lu-wang-g - thanks for looking into this issue. Have you been able to investigate, or found any obvious problem that I may have overlooked?
Just checking in - has anyone been able to investigate why this is happening? Any leads on what we could be trying or what is going wrong?
Description
The tflite model fails an inference check with an `Aborted (core dumped)` error and no further details. This is using the Coral inference code provided in the initially reported issue. I was directed here by the Coral team; details of the original issue are found here.
There is currently no clear indicator of why this is failing. Our model was a PyTorch-trained model converted to ONNX, then to TensorFlow, then to TensorFlow Lite, and quantized from float32 to int8 using the process detailed below. Included are the model's configuration (in the conditional lane detection repository's format), the tflite model, as well as the conversion scripts and associated conda environment details.
The following process was used to produce the tflite model:
python tools/pytorch2onnx.py configs/lane_detection_small_train.py lane-detection-model.pth --out lane-detection-model.onnx
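The ONNX model is then converted to TensorFlow and quantized to int8. As a point of comparison, a minimal, self-contained sketch of the full-integer quantization step is below; it uses a toy Keras model as a stand-in, since the real model, input shape, and file paths from our pipeline are not reproduced here:

```python
# Hedged sketch of full-integer (int8) TFLite quantization.
# The toy model and input shape are stand-ins, NOT the real
# lane-detection model from this issue.
import numpy as np
import tensorflow as tf

# Stand-in for the TensorFlow model produced from the ONNX export.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(8, 8, 3)),
    tf.keras.layers.Conv2D(4, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(2),
])

def representative_dataset():
    # Calibration samples; must match the model's input signature.
    for _ in range(10):
        yield [np.random.rand(1, 8, 8, 3).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
# Restricting to int8 builtins makes conversion fail loudly if any op
# lacks an int8 kernel, rather than crashing later at inference time.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8
tflite_model = converter.convert()

# Sanity-check the quantized model with the TFLite Interpreter.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
interpreter.set_tensor(inp["index"], np.zeros(inp["shape"], dtype=np.int8))
interpreter.invoke()
result = interpreter.get_tensor(out["index"])
```

If a toy model like this converts and invokes cleanly but the real model does not, the difference is usually an op that only partially supports int8.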
We have been able to run inference on the float32 models and the int8-converted model on a local development machine. Can you provide any insight into why the conversion/quantization appears to be incorrect or failing? Is there a way to get better log information when a core dump occurs, or a troubleshooting technique we are overlooking? Any help would be appreciated. Thanks!
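On the question of getting more information out of an `Aborted (core dumped)` crash: one standard-library option is `faulthandler`, which prints the Python traceback when the process receives a fatal signal such as SIGABRT. A hedged sketch (the inference call it would precede is hypothetical):

```python
# Sketch: surface more detail when the process dies with
# 'Aborted (core dumped)'. Stdlib only; no TFLite code here.
import faulthandler
import resource

# Print the Python-level traceback on SIGSEGV/SIGABRT/etc., which at
# least pinpoints which call (e.g. interpreter.invoke()) was running.
faulthandler.enable()

# Raise the core-file soft limit to the hard limit so a core dump is
# actually written; the native stack can then be inspected with
# `gdb python core` followed by `bt`.
soft, hard = resource.getrlimit(resource.RLIMIT_CORE)
resource.setrlimit(resource.RLIMIT_CORE, (hard, hard))

# ... hypothetical placement: run the failing TFLite inference here ...
```

Running the failing script under `gdb --args python script.py` and issuing `bt` after the abort is another way to see which native TFLite op triggered it.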
The tflite file is present (simply remove the .txt extension from it to use).
conda_environment.txt lane_detection_conversion_script.py.txt lane-detection-model.tflite.txt lane_detection_small_train.py.txt