Xilinx / Vitis-AI-Tutorials

MIT License

Can't use custom Yolov4 model with the Vitis AI Library (no prototxt file) #22

Open Luc-Meunier opened 3 years ago

Luc-Meunier commented 3 years ago

Hi,

I am working on using the Vitis AI library with a custom Yolov4 model.

I have followed the steps of this tutorial (convert Darknet to TensorFlow, freeze, quantize, compile): https://github.com/Xilinx/Vitis-Tutorials/tree/master/Machine_Learning/Design_Tutorials/07-yolov4-tutorial

I am using an Alveo U280 card, the Vitis AI Docker Image for CPU, and the TensorFlow 1 framework.

To deploy the model, I copied the folder obtained from the compilation step to "/usr/share/vitis_ai_library/models" (let "yolov4" be the name of the custom model and output folder) so that it can be found by the Vitis AI library.

Here is the content of the folder:

[image: listing of the compiled model folder]

And here is the content of a standard model from the Model Zoo (https://github.com/Xilinx/Vitis-AI/blob/master/models/AI-Model-Zoo/model-list/dk_yolov3_bdd_288_512_53.7G_1.3/model.yaml):

[image: listing of a Model Zoo model folder]

It seems that the meta.json "replaces" the model.prototxt.
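For reference, the meta.json emitted by the compiler is only a small routing file that points the runtime at the .xmodel; it does not carry the pre/post-processing configuration that the prototxt holds. A typical one looks roughly like this (field names and the library name are taken from a VAI 1.x compile and may differ in other versions):

```json
{
  "target": "DPUCAHX8H",
  "lib": "libvart-dpu-runner.so",
  "filename": "yolov4.xmodel",
  "kernel": [ "yolov4" ]
}
```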

I then ran the example code from https://github.com/Xilinx/Vitis-AI/tree/master/demo/Vitis-AI-Library/samples/yolov4

cd /usr/share/vitis_ai_library/samples/yolov4 
./test_video_yolov4 yolov4

Here is the error message when I try to run the application:

[image: error output from test_video_yolov4]

The model name is the argument to the following line of code:

vitis::ai::YOLOv3::create(model);
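The error suggests the library cannot resolve "yolov4" to its model files. Assuming (this layout is an assumption based on the Model Zoo folders above) that it looks for `<name>.xmodel` and `<name>.prototxt` under `/usr/share/vitis_ai_library/models/<name>/`, a quick sanity check could be:

```shell
# Sketch, assuming the library resolves the model name to
# /usr/share/vitis_ai_library/models/<name>/<name>.xmodel and <name>.prototxt.
check_model_dir() {
  dir=$1
  name=$2
  rc=0
  for f in "${name}.xmodel" "${name}.prototxt" "meta.json"; do
    if [ ! -f "${dir}/${f}" ]; then
      echo "missing: ${f}"
      rc=1
    fi
  done
  return ${rc}
}

# Demo against a throwaway directory that mimics my output folder
# (xmodel + meta.json present, but no prototxt):
tmp=$(mktemp -d)
touch "${tmp}/yolov4.xmodel" "${tmp}/meta.json"
check_model_dir "${tmp}" yolov4 || true   # prints: missing: yolov4.prototxt
```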

Maybe I am missing an argument when I run the vai_c_tensorflow command when compiling the model.

vai_c_tensorflow \
        --frozen_pb  ${QUANT}/quantize_eval_model.pb \
        --arch       ${ARCH} \
        --output_dir ${COMPILE} \
        --net_name   ${MODEL_NAME} \
        --options "{'mode':'normal','save_kernel':'', 'input_shape':'1,416,416,3'}"  

I would greatly appreciate your help. Best regards,

Luc

nhphuong91 commented 2 years ago

@Luc-Meunier The meta.json file is always there after the model has been successfully compiled. As for the <model>.prototxt file, I find it to be the trickiest one: you have to either copy it from a precompiled model for another device (zcu102/zcu104, for example) or write it yourself. You can download a precompiled model here and copy its <model>.prototxt into your output model folder.

Update: Sorry, I forgot to add the link to the instructions for writing the *.prototxt file yourself. You can take reference here (Page 71 – Chapter 4 | Section: Using the Configuration File)
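For anyone landing here, the configuration file referenced above is a text-format protobuf. A minimal yolov4 sketch, modeled on the yolov3 examples shipped with the Model Zoo, looks roughly like this (num_classes, thresholds, and the biases are placeholders using the default darknet yolov4 anchors; they must match your own training .cfg):

```
model {
  name: "yolov4"
  kernel {
    name: "yolov4"
    mean: 0.0
    mean: 0.0
    mean: 0.0
    scale: 0.00390625
    scale: 0.00390625
    scale: 0.00390625
  }
  model_type: YOLOv3
  yolo_v3_param {
    num_classes: 80
    anchorCnt: 3
    conf_threshold: 0.3
    nms_threshold: 0.45
    biases: 12
    biases: 16
    biases: 19
    biases: 36
    biases: 40
    biases: 28
    biases: 36
    biases: 75
    biases: 76
    biases: 55
    biases: 72
    biases: 146
    biases: 142
    biases: 110
    biases: 192
    biases: 243
    biases: 459
    biases: 401
    test_mAP: false
  }
}
```

There is one `biases` entry per anchor coordinate (2 × anchorCnt × number of output layers), in the same order as the anchors in the darknet .cfg.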

bhargavin1872008 commented 1 year ago

When running the requirements.txt I'm getting an error for coremltools. It says: "Could not find a version that satisfies the requirement tensorflow<=1.14,>=1.5 (from tfcoremltools -r requirements.txt) (from versions: 2.2.0, 2.2.1, 2.2.2, ... 2.7.0rc0, 2.7.0rc1, ...)". Can someone help me with this? Also, I have a doubt: can we use Ubuntu 20.04, CUDA 11.7, cuDNN 8.4.0 for this project, or does only Ubuntu 18.04 with CUDA 10.0 work? Please help me with this; I have little time on my hands.

nhphuong91 commented 1 year ago

@bhargavin1872008 Are you using an NVIDIA RTX 3xxx card? If yes, please use VAI 2.0 or 2.5 -> these are the VAI versions that support CUDA 11 & cuDNN 8

bhargavin1872008 commented 1 year ago

Thanks for connecting with me. I'm using a Quadro P1000 (NVIDIA GP107GL). The code in the Vitis-AI yolov4 README: does it belong to TensorFlow 1.x or TensorFlow 2.x?


bhargavin1872008 commented 1 year ago

@nhphuong91 please reply to my question:

Can we install CUDA 10.0 on Ubuntu 20.04?

nhphuong91 commented 1 year ago

@bhargavin1872008 For the full support matrix, you can refer here: https://docs.nvidia.com/deeplearning/cudnn/support-matrix/index.html From that, I suppose your current configuration would still work if you use the Vitis-AI 2.0 or 2.5.0 GPU docker for the yolov4 tutorial. (NOTE: If you have trouble building the VAI GPU docker, refer to my temporary fix here: https://github.com/Xilinx/Vitis-AI/issues/743) About the TF version used in that tutorial: it's TensorFlow 1.x

bhargavin1872008 commented 1 year ago

I have successfully installed the Docker engine and added the user to the docker group as per the Docker documentation. I'm using Vitis-AI 1.3. The only thing is, when building xilinx/vitis-ai-gpu from the docker recipes:

cd setup/docker
./docker_build_gpu.sh
./docker_run.sh xilinx/vitis-ai-gpu:latest

I'm unable to build Vitis-AI; the workspace is not getting created.

Instead, can I use the prebuilt Docker image?

docker pull xilinx/vitis-ai-cpu:latest
./docker_run.sh xilinx/vitis-ai-cpu:latest

My doubt is: does using this CPU version of the Docker image affect my project, which requires graphics drivers for CUDA, cuDNN, etc.? Please clarify this.

nhphuong91 commented 1 year ago

@bhargavin1872008 The quantizing step needs a GPU to run faster, but the VAI CPU docker will still work anyway. VAI 1.3 is too old; you should use at least 1.4, which I've tried and confirmed to work. The tutorial also states that it works with VAI 1.4.

bhargavin1872008 commented 1 year ago


Can you clarify which TensorFlow version we are using, 1.x or 2.x?

Didn't you find an error when running the requirements.txt of keras-yolov3-set? I'm getting an error for coremltools. It says: "Could not find a version that satisfies the requirement tensorflow<=1.14,>=1.5 (from tfcoremltools -r requirements.txt) (from versions: 2.2.0, 2.2.1, 2.2.2, ... 2.7.0rc0, 2.7.0rc1, ...)". How did you fix it?

Can someone help me with this? Also, I have a doubt: can we use Ubuntu 20.04, CUDA 11.7, cuDNN 8.4.0 for this project, or does only Ubuntu 18.04 with CUDA 10.0 work? Please help me with this; I have little time on my hands.

nhphuong91 commented 1 year ago

@bhargavin1872008 No, the meta.json is not a replacement for the <model>.prototxt file. You can take any yolov4 *.prototxt file from the VAI model zoo as an example to run your compiled model. For instructions on how to create your own *.prototxt file, refer here: https://docs.xilinx.com/r/en-US/ug1354-xilinx-ai-sdk/Using-the-Configuration-File