rockchip-linux / rknpu2


How to run inference with multiple batches? #78

Open Zhangzhenzhou96 opened 1 year ago

Zhangzhenzhou96 commented 1 year ago

How do you run inference with multiple batches? Is there a multi-batch inference example using the C API? The docs are unclear and obscure on this point. What does the model input height mean, and how should h_stride be set? Please provide an example.
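
For anyone else hitting this: the sketch below shows one common way to wire up multi-batch inference with the rknpu2 C API, assuming the model was exported with rknn_batch_size set in rknn-toolkit2 so that the runtime expects a single contiguous NHWC buffer of BATCH_SIZE images. BATCH_SIZE, IMG_W/IMG_H/IMG_C, and the fill_image() preprocessing step are placeholders, error handling is minimal, and the stride fields (w_stride/h_stride) that newer SDK versions report in rknn_tensor_attr are not handled. Treat it as an illustration rather than official example code.

```c
/*
 * Minimal multi-batch inference sketch for the rknpu2 C API.
 * NOT code from this repo or the linked one; it assumes the model was
 * exported with rknn_batch_size = BATCH_SIZE in rknn-toolkit2, so the
 * runtime expects one contiguous NHWC buffer holding BATCH_SIZE images
 * back to back. BATCH_SIZE / IMG_* and the preprocessing are placeholders.
 */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include "rknn_api.h"

#define BATCH_SIZE 4   /* must match rknn_batch_size used at export time */
#define IMG_W 640
#define IMG_H 640
#define IMG_C 3

int main(int argc, char* argv[]) {
    if (argc < 2) {
        printf("usage: %s model.rknn\n", argv[0]);
        return -1;
    }

    /* Load the .rknn file into memory. */
    FILE* fp = fopen(argv[1], "rb");
    if (!fp) { printf("failed to open %s\n", argv[1]); return -1; }
    fseek(fp, 0, SEEK_END);
    long model_size = ftell(fp);
    fseek(fp, 0, SEEK_SET);
    void* model_data = malloc(model_size);
    if (fread(model_data, 1, model_size, fp) != (size_t)model_size) {
        printf("failed to read model\n");
        return -1;
    }
    fclose(fp);

    rknn_context ctx = 0;
    int ret = rknn_init(&ctx, model_data, (uint32_t)model_size, 0, NULL);
    free(model_data);
    if (ret < 0) { printf("rknn_init failed: %d\n", ret); return -1; }

    /* For a batched model the runtime should already report an input size of
       BATCH_SIZE * one image; printing it is a quick sanity check. */
    rknn_input_output_num io_num;
    rknn_query(ctx, RKNN_QUERY_IN_OUT_NUM, &io_num, sizeof(io_num));

    rknn_tensor_attr in_attr;
    memset(&in_attr, 0, sizeof(in_attr));
    in_attr.index = 0;
    rknn_query(ctx, RKNN_QUERY_INPUT_ATTR, &in_attr, sizeof(in_attr));
    printf("runtime expects input of %u bytes\n", in_attr.size);

    /* Concatenate BATCH_SIZE preprocessed images into one buffer.
       fill_image() stands in for your own resize/letterbox + RGB code. */
    size_t one_img = (size_t)IMG_W * IMG_H * IMG_C;
    unsigned char* batch_buf = malloc(one_img * BATCH_SIZE);
    for (int b = 0; b < BATCH_SIZE; b++) {
        /* fill_image(batch_buf + b * one_img, b); */
        memset(batch_buf + b * one_img, 0, one_img);  /* dummy pixels */
    }

    rknn_input inputs[1];
    memset(inputs, 0, sizeof(inputs));
    inputs[0].index = 0;
    inputs[0].type  = RKNN_TENSOR_UINT8;
    inputs[0].fmt   = RKNN_TENSOR_NHWC;
    inputs[0].size  = (uint32_t)(one_img * BATCH_SIZE);
    inputs[0].buf   = batch_buf;
    ret = rknn_inputs_set(ctx, io_num.n_input, inputs);
    if (ret < 0) { printf("rknn_inputs_set failed: %d\n", ret); return -1; }

    ret = rknn_run(ctx, NULL);
    if (ret < 0) { printf("rknn_run failed: %d\n", ret); return -1; }

    /* Each output holds the results for all BATCH_SIZE images concatenated
       along the batch dimension; slice it per image when post-processing.
       Assumes at most 8 output tensors (YOLOv5 has 3). */
    rknn_output outputs[8];
    memset(outputs, 0, sizeof(outputs));
    for (uint32_t i = 0; i < io_num.n_output; i++) {
        outputs[i].index = i;
        outputs[i].want_float = 1;
    }
    ret = rknn_outputs_get(ctx, io_num.n_output, outputs, NULL);
    if (ret < 0) { printf("rknn_outputs_get failed: %d\n", ret); return -1; }

    /* ... per-batch post-processing goes here ... */

    rknn_outputs_release(ctx, io_num.n_output, outputs);
    free(batch_buf);
    rknn_destroy(ctx);
    return 0;
}
```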

Uhao-P commented 1 year ago

Same question.

crab2rab commented 10 months ago

Same question.

crab2rab commented 10 months ago

> How do you run inference with multiple batches? Is there a multi-batch inference example using the C API? The docs are unclear and obscure on this point. What does the model input height mean, and how should h_stride be set? Please provide an example.

https://github.com/crab2rab/RKNN-YOLOV5-BatchInference-MultiThreading.git After a day of work, I implemented batch inference here. Testing and feedback are welcome.

crab2rab commented 10 months ago

> Same question.

https://github.com/crab2rab/RKNN-YOLOV5-BatchInference-MultiThreading.git After a day of work, I implemented batch inference here. Testing and feedback are welcome.

S0soo commented 2 months ago

> Same question.

> https://github.com/crab2rab/RKNN-YOLOV5-BatchInference-MultiThreading.git After a day of work, I implemented batch inference here. Testing and feedback are welcome.

After running multi-batch inference, the results for every batch are identical. What could be causing this?
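
No authoritative answer here, but two things worth ruling out when every batch slice comes back identical: the same image being copied into every slot of the concatenated input buffer, and post-processing always reading from offset 0 of the output instead of stepping to each batch's slice. The helper below is a hypothetical debugging sketch under those assumptions; check_batch_slices, img_bytes, and out_elems_per_img are made-up names, and out_buf is assumed to be a float output obtained with want_float = 1.

```c
/* Hypothetical debugging helper, not from the linked repo: walks the
   concatenated per-batch slices of one input buffer and one float output
   buffer. batch_buf / out_buf / img_bytes / out_elems_per_img are assumed
   to come from the surrounding inference code. */
#include <stddef.h>
#include <stdio.h>
#include <string.h>

static void check_batch_slices(const unsigned char* batch_buf, size_t img_bytes,
                               const float* out_buf, size_t out_elems_per_img,
                               int batch_size) {
    for (int b = 1; b < batch_size; b++) {
        /* If every input slot holds the same pixels, identical outputs are expected. */
        int same = memcmp(batch_buf, batch_buf + (size_t)b * img_bytes, img_bytes) == 0;
        printf("batch %d: input identical to batch 0: %s\n", b, same ? "yes" : "no");
    }
    for (int b = 0; b < batch_size; b++) {
        /* Post-processing must start at this per-batch offset, not always at out_buf. */
        const float* slice = out_buf + (size_t)b * out_elems_per_img;
        printf("batch %d: first output value %f\n", b, slice[0]);
    }
}
```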