PaddlePaddle / Anakin

High-performance cross-platform inference engine; you can run Anakin on x86 CPU, ARM, NVIDIA GPU, AMD GPU, Bitmain, and Cambricon devices.
https://anakin.baidu.com/
Apache License 2.0

Does it have the multi-thread test ? #504

Open Rpersie opened 5 years ago

Rpersie commented 5 years ago

In my task, I need not only to run network computation on the GPU but also to do complex post-processing on the CPU. So I need to create multiple threads on the CPU, with each thread pushing its own data to the GPU. But I have encountered a big problem: this architecture cannot support many CPU threads. For example, the machine can run 56 threads on the CPU, but the CPU-GPU mixed architecture can only run 8 threads. Does this project have any multithread examples, or some advice about how to use it in a multithreaded setting? Thank you!

2015-10-10 commented 5 years ago

Anakin has updated the multi-thread test in docs/Manual/Anakin_helper_ch.md and will push it to GitHub very soon.