sonos / tract

Tiny, no-nonsense, self-contained, Tensorflow and ONNX inference

onnx inference is slow #1246

Open oaifaye opened 10 months ago

oaifaye commented 10 months ago

I compared tract with onnxruntime running PPOCR on CPU and found that the Python (onnxruntime) version was 9 times faster than tract. I wonder if this is normal.

cospectrum commented 8 months ago

> I compared tract with onnxruntime running PPOCR on CPU and found that the Python (onnxruntime) version was 9 times faster than tract. I wonder if this is normal.

onnxruntime uses multiple threads by default. tract is single-threaded, so of course it will be slower on a single input. However, since tract is thread-safe, I still use it in some cases.

If the task naturally divides into batches, for example processing several frames from a video stream, then tract combined with rayon gives good throughput; in my domain tract is very often faster than onnxruntime. Likewise, if the task is to write a web server, tract can serve each client independently, while an onnxruntime session has to be wrapped in a mutex, so clients wait for each other and throughput suffers under sustained load.
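A minimal sketch of the sharing pattern described above, using only the standard library. The `Model` struct here is a hypothetical placeholder standing in for tract's runnable model (whose `run` method likewise takes `&self`); in real batch work you would typically use rayon's parallel iterators instead of spawning threads by hand.

```rust
use std::thread;

// Hypothetical stand-in for a thread-safe model: `run` takes `&self`,
// so many threads can call it concurrently without a mutex.
struct Model;

impl Model {
    // Placeholder "inference": just doubles the input.
    fn run(&self, input: i64) -> i64 {
        input * 2
    }
}

// Run one inference per input, each on its own thread, sharing the
// model by immutable reference.
fn run_batch(model: &Model, inputs: &[i64]) -> Vec<i64> {
    thread::scope(|s| {
        let handles: Vec<_> = inputs
            .iter()
            .map(|&x| s.spawn(move || model.run(x)))
            .collect();
        handles.into_iter().map(|h| h.join().unwrap()).collect()
    })
}

fn main() {
    let model = Model;
    let outputs = run_batch(&model, &[0, 1, 2, 3]);
    println!("{:?}", outputs); // [0, 2, 4, 6]
}
```

The key point is that no lock appears anywhere: because the model is only ever borrowed immutably, clients (or batch items) never wait for each other.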

It all depends on the use case.