KeyKy opened this issue 3 years ago
I get this error on a Tesla T4 when running the following command:

CUDA_VISIBLE_DEVICES=0 python infer.py --m resnet18 --load_ckpt /data/yangkang/q_tensorrt/exps/pytorch_exp_1/ckpt_24 --netqat --INT8QAT

[TensorRT] ERROR: No non-int8 implementation of layer [CONVOLUTION #1]

However, I do not get this error on a Tesla P40, or on the Tesla T4 when using INT8PTC. Why?
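For context, this is a minimal sketch of how a QAT (Q/DQ) engine is typically built with the standard TensorRT Python API; it is an assumption on my part, not the repo's actual infer.py, and the model path is hypothetical. The usual way to avoid "no non-int8 implementation" failures is to enable INT8 without forcing strict types, so TensorRT may fall back to FP16/FP32 kernels for layers that have no int8 implementation on a given GPU:

```python
# Hedged sketch, not the repo's infer.py: build a TensorRT engine from a
# QAT-exported ONNX model, allowing precision fallback.
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

def build_int8_engine(onnx_path):
    builder = trt.Builder(TRT_LOGGER)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
    parser = trt.OnnxParser(network, TRT_LOGGER)

    # Parse the ONNX graph (assumed to contain Q/DQ nodes from QAT export).
    with open(onnx_path, "rb") as f:
        if not parser.parse(f.read()):
            for i in range(parser.num_errors):
                print(parser.get_error(i))
            raise RuntimeError("ONNX parse failed")

    config = builder.create_builder_config()
    config.max_workspace_size = 1 << 30
    config.set_flag(trt.BuilderFlag.INT8)   # use the Q/DQ scales from QAT
    config.set_flag(trt.BuilderFlag.FP16)   # permit FP16 fallback kernels
    # Leaving STRICT_TYPES unset lets TensorRT choose a non-int8
    # implementation when no int8 kernel exists for a layer on this GPU.

    return builder.build_engine(network, config)

# Hypothetical usage:
# engine = build_int8_engine("resnet18_qat.onnx")
```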
Do you have a reproducible script?