VeriSilicon / TIM-VX

VeriSilicon Tensor Interface Module

UINT8 model could not run on the A311D NPU! #412

Closed 2050airobert closed 2 years ago

2050airobert commented 2 years ago

Hi,

  1. Can .pth or .pt models quantized with PyTorch be run directly on the rk3399 ARM CPU and the rk3399pro NPU?
  2. If I use quantized models in other formats, such as ONNX, how can I run them on the rk3399 and rk3399pro NPUs?
  3. What formats of quantized models does the rk3399pro NPU support? Or does the model have to be converted to the Rockchip framework, quantized further, and only then deployed on the board?

BR

sunshinemyson commented 2 years ago

@2050airobert ,

You can convert the model to tflite format and then run inference with our tflite-vx-delegate. For the 3399pro, we support uint8_asymm quantized models natively. You don't have to convert it to the Rockchip format - tflite format is fine.
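
For reference, a rough sketch of running the converted .tflite model through the delegate with the TFLite Python API (the delegate library path and model filename below are placeholders for your setup):

```python
import numpy as np
import tensorflow as tf

# Load the uint8 asymmetric-quantized model and hand execution to the
# VeriSilicon external delegate. The delegate library path is a
# placeholder; use the .so produced by your tflite-vx-delegate build.
vx_delegate = tf.lite.experimental.load_delegate("/usr/lib/libvx_delegate.so")
interpreter = tf.lite.Interpreter(
    model_path="model_uint8.tflite",
    experimental_delegates=[vx_delegate],
)
interpreter.allocate_tensors()

# Run one inference with a dummy uint8 input.
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
interpreter.set_tensor(inp["index"], np.zeros(inp["shape"], dtype=inp["dtype"]))
interpreter.invoke()
print(interpreter.get_tensor(out["index"]))
```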

Thanks

2050airobert commented 2 years ago

@sunshinemyson The Rockchip 3399pro supports uint8 models quantized with PyTorch QAT, right? If I have a uint8 model quantized with the PyTorch tooling, do you know how I could run it directly on the 3399pro NPU (VeriSilicon A311D NPU)?

2050airobert commented 2 years ago

Could anyone help? Thanks.

zjd1988 commented 2 years ago

@2050airobert @sunshinemyson Does the rk3399pro's NPU use VeriSilicon's IP core? Is there a full list of chips that use VeriSilicon's IP core?

sunshinemyson commented 2 years ago

@2050airobert ,

If you have a uint8 model from PyTorch, you can convert it to tflite with the AcuityLite tool, then run it with vx-delegate + tim-vx.

https://pypi.org/project/acuitylite/

sunshinemyson commented 2 years ago

> Does the rk3399pro's NPU use VeriSilicon's IP core? Is there a full list of chips that use VeriSilicon's IP core?

Most VSI customers don't publish their chips for third-party developers. Besides RK and AMLogic (VIM3), an NXP 8MP dev-kit is available at https://detail.tmall.com/item.htm?spm=a230r.1.14.3.412e177d828y5B&id=653946586608&ns=1&abbucket=19.

2050airobert commented 2 years ago

  1. You mean that not only can a PyTorch .pth or .pt model be converted to a TF model, but even a PyTorch uint8 model can be converted to tflite?
  2. Are you sure about that?
  3. Could there be more problems when converting a PyTorch uint8 model to a tflite model? Could you show more successful cases or internal test cases?

sunshinemyson commented 2 years ago

@2050airobert ,

PyTorch model -> ONNX model -> TFLite model is the only path through which we can support PyTorch models with tim-vx today. I cannot guarantee that every model can be converted successfully, but we can fix it if you share the failing case.
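
Roughly, the steps could look like the sketch below (assumptions: a torchvision MobileNetV2 stands in for your own network, onnx-tf is used as the ONNX-to-TF bridge, and quantization options are omitted for brevity):

```python
import torch
import torchvision
import onnx
import tensorflow as tf
from onnx_tf.backend import prepare

# 1. PyTorch -> ONNX (MobileNetV2 stands in for your own model here).
model = torchvision.models.mobilenet_v2(weights=None).eval()
dummy = torch.zeros(1, 3, 224, 224)
torch.onnx.export(model, dummy, "model.onnx", opset_version=13)

# 2. ONNX -> TensorFlow SavedModel via onnx-tf.
prepare(onnx.load("model.onnx")).export_graph("saved_model")

# 3. SavedModel -> TFLite. How the uint8 quantization is carried through
#    depends on how the original model was quantized; no options shown here.
converter = tf.lite.TFLiteConverter.from_saved_model("saved_model")
with open("model.tflite", "wb") as f:
    f.write(converter.convert())
```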

Thanks.