Expect a major update in the near future. TRT 7.0 and 7.1 introduce some significant changes, including new IExecutionContext behavior that is not backward compatible with trtlab.
The new trtlab will bring a lot of updates to memory management as well as custom allocators. I hope to have it out in a few weeks.
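For context, trtlab creates execution contexts without their own activation memory and supplies the workspace from its own allocators, which is the path the TRT 7.x change affects. Below is a minimal sketch of that pattern with the standard TensorRT C++ API; the function name and error handling are illustrative, not trtlab's actual code.

```cpp
#include <NvInfer.h>
#include <cuda_runtime_api.h>

// Sketch of the deferred-device-memory pattern (illustrative only): the context
// is created without TensorRT allocating activation memory, and the caller
// provides a workspace of at least engine->getDeviceMemorySize() bytes.
nvinfer1::IExecutionContext* createContextWithExternalMemory(
    nvinfer1::ICudaEngine* engine, void** workspace)
{
    // Context creation without TensorRT-owned device memory.
    nvinfer1::IExecutionContext* ctx = engine->createExecutionContextWithoutDeviceMemory();
    if (!ctx) { return nullptr; }

    // Caller-owned workspace; a framework like trtlab would draw this from its own pools.
    size_t bytes = engine->getDeviceMemorySize();
    if (cudaMalloc(workspace, bytes) != cudaSuccess)
    {
        ctx->destroy();
        return nullptr;
    }

    // The workspace must be attached before execute()/enqueue() is called.
    ctx->setDeviceMemory(*workspace);
    return ctx;
}
```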
Hi sir,
Does this version support TensorRT 7.0? I get the following error:
```
F0518 18:01:28.024194  7095 runtime.cc:92] [TensorRT.ERROR]: ../rtSafe/cuda/caskConvolutionRunner.cpp (233) - Cuda Error in allocateContextResources: 1 (invalid argument)
Check failure stack trace:
    @     0x7f23a3dcce52  google::LogMessage::Fail()
    @     0x7f23a3dccd9a  google::LogMessage::SendToLog()
    @     0x7f23a3dcc6db  google::LogMessage::Flush()
    @     0x7f23a3dcfb56  google::LogMessageFatal::~LogMessageFatal()
    @     0x559f495c24c2  trtlab::TensorRT::Runtime::Logger::log()
    @     0x7f2394b1b44a  nvinfer1::LogStream::Buf::sync()
    @     0x7f239438dd7e  std::ostream::flush()
    @     0x7f2394b1bd5a  nvinfer1::throwCudaError()
    @     0x7f2394b2241a  nvinfer1::rt::task::CaskConvolutionRunner::allocateContextResources()
    @     0x7f2394afc02c  nvinfer1::rt::SafeExecutionContext::SafeExecutionContext()
    @     0x7f23948b7581  nvinfer1::rt::ExecutionContext::ExecutionContext()
    @     0x7f23948b7a87  nvinfer1::rt::Engine::createExecutionContextWithoutDeviceMemory()
    @     0x559f495bb038  trtlab::TensorRT::Model::CreateExecutionContext()
    @     0x559f49590f34  trtlab::TensorRT::InferenceManager::RegisterModel()
    @     0x559f49590806  trtlab::TensorRT::InferenceManager::RegisterModel()
    @     0x559f49582ac5  main
    @     0x7f23938f5b97  __libc_start_main
    @     0x559f4958231a  _start
    @              (nil)  (unknown)
```
This happens when I run inference in the 20.3-py3 Docker image, using examples/00_TensorRT. Thank you very much!
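One way to narrow this down (a diagnostic sketch, not an official fix): check whether the same deserialized engine works when the context owns its own device memory. If `createExecutionContext()` succeeds where the without-device-memory path fails, that points at the TRT 7.x IExecutionContext change mentioned above.

```cpp
// Hypothetical check on the same nvinfer1::ICudaEngine* engine:
// TensorRT allocates the activation workspace itself.
nvinfer1::IExecutionContext* owned = engine->createExecutionContext();
// The path trtlab uses; workspace would be supplied later via setDeviceMemory().
nvinfer1::IExecutionContext* external = engine->createExecutionContextWithoutDeviceMemory();
```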