Xilinx / Vitis-AI

Vitis AI is Xilinx’s development stack for AI inference on Xilinx hardware platforms, including both edge devices and Alveo cards.
https://www.xilinx.com/ai
Apache License 2.0

How can I get OPS for compiled xmodels? #681

Closed Richsheep closed 2 years ago

Richsheep commented 2 years ago

Is there any way, or any tool, to find out the OPS of compiled xmodels? Or can I only calculate the FLOPS of the float model?

I remember that in DNNDK 3.1, GOPS was shown in the model compile result.

[image: DNNDK 3.1 compile output showing GOPS]

How can I get this parameter in Vitis-AI-1.3.2 for Caffe/TensorFlow compiled models?

Thanks a lot.

wanghong4compiler commented 2 years ago

Hi @Richsheep, if you want to get more info about the compiled model, you can use the "xir" tool. One way to use it is: xir dump_txt <xmodel> [txt], e.g. xir dump_txt a.xmodel a.txt. When [txt] is missing, it dumps to standard output. The txt file provides a lot of detail: for code size you can search for "mc_code", and for parameter size you can search for "reg_id_to_size". But unfortunately, the workload info is not available until Vitis-AI-2.0.
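The workflow described above can be sketched as a short command sequence. The dump file contents here are fabricated for illustration only; a real dump would come from running xir dump_txt on an actual xmodel (the xir tool ships in the Vitis-AI docker image):

```shell
# Illustrative only: fabricate a tiny fragment of "xir dump_txt"-style output.
# A real dump would come from: xir dump_txt model.xmodel model.txt
cat > model.txt <<'EOF'
mc_code: <binary payload omitted>
reg_id_to_size: {0: 1228800, 1: 4096}
EOF

# Search the dump for the fields mentioned above
grep "mc_code" model.txt         # code size entries
grep "reg_id_to_size" model.txt  # parameter size entries
```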

Richsheep commented 2 years ago

Hi @wanghong4compiler, thanks, but I'm still confused: why was workload removed in Vitis-AI, and why do models in the model zoo have their workload marked?

wanghong4compiler commented 2 years ago

Hi @Richsheep, sorry for the incorrect info: you can also get the workload info. Sorry again.

Richsheep commented 2 years ago

Hi @wanghong4compiler, how can I get the workload info? Do I need any other tools?

wanghong4compiler commented 2 years ago

Hi @Richsheep, xir is an open source repo; you can get the source code from the Xilinx GitHub. It depends on the unilog repo, which is also open source and also on the Xilinx GitHub. If you use the docker image, you can use xir directly without doing anything.

Richsheep commented 2 years ago

Many thanks, I found the workload info in the xir dump_txt output; it seems to be under subg_root.

[image: xir dump_txt output showing the workload attribute under subg_root]

Thanks again for your support!
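Building on the subg_root finding above, the workload value can be pulled out of a dump_txt file with standard shell tools. This is only a sketch: the sample value below is hypothetical, and it assumes workload is a raw operation count, so dividing by 1e9 would give GOPs:

```shell
# Illustrative only: hypothetical dump_txt fragment with a workload attribute
cat > dump.txt <<'EOF'
subg_root {
  workload: 360000000
}
EOF

# Extract workload numbers, sum them, and report GOPs
# (assumes workload is a raw op count; divide by 1e9 for GOPs)
grep -o 'workload: [0-9]*' dump.txt \
  | awk '{ total += $2 } END { printf "total ops: %d (%.2f GOPs)\n", total, total / 1e9 }'
```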

Serendi-pity1995 commented 2 years ago

Hi, I also got that info with the xir dump_txt tool, but I don't know which number represents GOPs. What is the conversion relationship between them? I used xir to generate the info for the model "tf2_efficientnet-b0_imagenet_224_224_0.36G_2.0", but the numbers there do not match it (0.36G).

[image: xir dump_txt output for the model]

Thanks!

ahnu-zhaorong commented 4 months ago

My question is the same. Where can I find detailed documentation for OPS? The information provided by dump_txt is not documented anywhere!