eezhang123 closed this issue 1 year ago
I pulled the image and started the container following the README (there should be no need to install TensorRT separately, right?). Running the run_inference example from https://github.com/openppl-public/ppq/blob/master/md_doc/deploy_trt_by_OnnxParser.md fails with the following error:
Traceback (most recent call last):
  File "trt_inference.py", line 77, in <module>
    inputs_alloc_buf, outputs_alloc_buf, bindings_alloc_buf, stream_alloc_buf = alloc_buf_N(engine, data)
  File "trt_inference.py", line 20, in alloc_buf_N
    stream = cuda.Stream()
pycuda._driver.LogicError: explicit_context_dependent failed: invalid device context - no currently active context?
Solved it: PyCUDA needs to be initialized first. Add one line at the top of the script: import pycuda.autoinit
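For anyone hitting the same error, here is a minimal sketch of the fix. The body of alloc_buf_N below is a hypothetical reconstruction following the usual pycuda/TensorRT buffer-allocation pattern (the actual trt_inference.py may differ); the key point is that importing pycuda.autoinit creates and activates a CUDA context as a side effect of the import, which cuda.Stream() requires.

```python
# Sketch of the fix. Only the pycuda.autoinit import is the actual change;
# the rest illustrates a typical alloc_buf_N (assumed, not the real file).
import pycuda.autoinit  # noqa: F401 -- importing this initializes and activates a CUDA context
import pycuda.driver as cuda
import tensorrt as trt

def alloc_buf_N(engine, data):
    # Without an active CUDA context, this line raises:
    # pycuda._driver.LogicError: explicit_context_dependent failed: invalid device context
    stream = cuda.Stream()
    inputs, outputs, bindings = [], [], []
    for binding in engine:  # iterate over binding names
        size = trt.volume(engine.get_binding_shape(binding))
        dtype = trt.nptype(engine.get_binding_dtype(binding))
        host_mem = cuda.pagelocked_empty(size, dtype)   # page-locked host buffer
        device_mem = cuda.mem_alloc(host_mem.nbytes)    # matching device buffer
        bindings.append(int(device_mem))
        if engine.binding_is_input(binding):
            inputs.append((host_mem, device_mem))       # `data` would later be copied here
        else:
            outputs.append((host_mem, device_mem))
    return inputs, outputs, bindings, stream
```

Everything after the import is unchanged boilerplate; the error disappears once pycuda.autoinit has set up the context before cuda.Stream() is called.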
Uh-oh, that's bad.