I've followed this https://github.com/Xilinx/Vitis-AI-Tutorials/tree/1.4/Design_Tutorials/07-yolov4-tutorial (using ./docker_run.sh xilinx/vitis-ai-cpu:1.4.916, because it requires Vitis AI 1.4) to quantize and compile a darknet-trained YOLOv4 on my custom data. I tried the Caffe flow of the tutorial. When I get to the final step, running the inference on the ZCU102 FPGA, I get this error:
[libprotobuf ERROR google/protobuf/text_format.cc:323] Error parsing text-format vitis.ai.proto.DpuModelParamList: 1:5: Message type "vitis.ai.proto.DpuModelParamList" has no field named "name".
WARNING: Logging before InitGoogleLogging() is written to STDERR
F1119 17:30:48.142455 1033 configurable_dpu_task_imp.cpp:142] Check failed: ok cannot parse config file. config_file=dpu_yolov4_voc_andante.prototxt
Check failure stack trace:
Aborted
Do you know what this means? Please give me a hand, it's driving me crazy.
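For what it's worth, comparing with the model prototxt files that ship with the Vitis AI Library, the parser seems to expect everything wrapped in a top-level model { ... } block (the list message itself only seems to have a repeated model field), so the "no field named name" at 1:5 suggests my file starts with a bare name: entry at the top level instead. A rough sketch of the layout I believe it expects, with field names copied from the yolov3_voc example that ships with the library and placeholder values (not my real anchors/thresholds), please correct me if this is wrong:

    model {
      name: "dpu_yolov4_voc_andante"
      kernel {
        name: "dpu_yolov4_voc_andante"
        mean: 0.0
        mean: 0.0
        mean: 0.0
        scale: 0.00390625
        scale: 0.00390625
        scale: 0.00390625
      }
      model_type: YOLOv3
      yolo_v3_param {
        num_classes: 20      # placeholder, my dataset has a different count
        anchorCnt: 3
        conf_threshold: 0.3
        nms_threshold: 0.45
        biases: 10           # one biases entry per anchor value
        biases: 13
        # remaining biases omitted in this sketch
        test_mAP: false
      }
    }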
Hi @vongracia, could you ask this question in the Vitis-AI-Tutorials repo or on the Xilinx forums? This issue doesn't seem to be related to the Vitis Acceleration flow. The engineers watching those two places may be able to help you.
Hi folks,
Same setup and the same error as in my post above.
Here are the 4 files (file.prototxt, file.xmodel, md5sum.txt, meta.json) produced at the end of the flow. I used only the prototxt and the xmodel to run the inference: https://wetransfer.com/downloads/032432a82712d0d99a67cc0c2d6beabb20220928144618/267f8e
Please take a look; maybe you can find something. Thanks!
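For reference, this is roughly how I load those two files on the board, following the yolov3 sample from the Vitis AI Library. The model name is mine (the prototxt/xmodel pair sits in a folder with that name under the library's model search path, e.g. /usr/share/vitis_ai_library/models/); the API is the one I see declared in vitis/ai/yolov3.hpp, but please double-check the exact type and field names against the headers on the 1.4 board image:

    // Minimal sketch of how I run the model on the ZCU102, based on the
    // yolov3 sample in the Vitis AI Library 1.4. The library looks for
    // <name>.xmodel and <name>.prototxt in its model search path.
    #include <iostream>
    #include <opencv2/imgcodecs.hpp>
    #include <vitis/ai/yolov3.hpp>

    int main(int argc, char* argv[]) {
      if (argc < 2) {
        std::cerr << "usage: " << argv[0] << " <image>" << std::endl;
        return 1;
      }
      cv::Mat image = cv::imread(argv[1]);
      if (image.empty()) {
        std::cerr << "cannot read " << argv[1] << std::endl;
        return 1;
      }
      // This create() call is what parses the prototxt and is where the
      // "cannot parse config file" abort happens for me.
      auto yolo = vitis::ai::YOLOv3::create("dpu_yolov4_voc_andante", true);
      auto results = yolo->run(image);
      // Boxes come back in coordinates relative to the input image size.
      for (const auto& box : results.bboxes) {
        std::cout << "label=" << box.label << " score=" << box.score
                  << " x=" << box.x * image.cols << " y=" << box.y * image.rows
                  << " w=" << box.width * image.cols
                  << " h=" << box.height * image.rows << std::endl;
      }
      return 0;
    }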