Closed wasupandceacar closed 2 months ago
Thanks for your contribution!
@zhupengyang @hong19860320
@zhupengyang @hong19860320
@zhupengyang @hong19860320 Still nobody here, guys?
@zhupengyang @hong19860320 ?
@zhupengyang @hong19860320 Is anyone still maintaining this?
@hong19860320 @zhupengyang Two weeks and still no reply?
@hong19860320 @zhupengyang Is there nobody available to review, or what? Please say something; I see PRs newer than mine have already been merged.
I suggest testing the case where two threads run two OpenCL models at the same time. I vaguely remember that @zhaoyang-star added this to solve that problem.
PR devices
OpenCL
PR types
Bug fixes
PR changes
Backends
Description
Fix the CL_INVALID_CONTEXT error that occurs when a model runs predictions in multiple threads.
Previous issues: https://github.com/PaddlePaddle/Paddle-Lite/issues/9931 https://github.com/PaddlePaddle/Paddle-Lite/issues/10267
Caused by https://github.com/PaddlePaddle/Paddle-Lite/blob/bd60e696cc8233b0ebef3b4db932923d668e8c50/lite/backends/opencl/cl_runtime.cc#L27
Using `thread_local` here leads to different OpenCL contexts in different threads, as mentioned in https://github.com/PaddlePaddle/Paddle-Lite/issues/9931#issuecomment-1407544996. It is also not a good solution in general, as discussed in https://github.com/PaddlePaddle/Paddle-Lite/issues/7888#issuecomment-990679917, and causes other bugs.

New solution: use a single instance, so that all threads share one global `CLRuntime`.