Rashmip-nd opened 2 months ago
Have you serialized the NumPy arrays into a JSON-compatible format, e.g. like this?

import json
import numpy as np

input_data = np.random.rand(1, 3, 384, 640).astype(np.float32)
input_data_list = input_data.tolist()
with open("custom.json", "w") as f:
    json.dump({"input": input_data_list}, f)
Do you have modules like numpy and json installed locally? If yes, are you willing to share your custom.json content?
Loading input data from custom.json
[!] Could not decode serialized type: np.ndarray. This could be because a required module is missing.

This error means your JSON contains the illegal type np.ndarray; you can use .tolist() to convert the np.ndarray before dumping it. @Rashmip-nd
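A minimal sketch of the .tolist() round trip suggested above, using only numpy and the standard json module; the key "input" is assumed to match the model's actual input tensor name:

```python
import json
import numpy as np

# Sample input matching the model's (1, 3, 384, 640) shape.
input_data = np.random.rand(1, 3, 384, 640).astype(np.float32)

# tolist() converts the ndarray into nested Python lists,
# which the standard json encoder can serialize.
with open("custom.json", "w") as f:
    json.dump({"input": input_data.tolist()}, f)

# Reading it back reconstructs an equivalent array.
with open("custom.json") as f:
    restored = np.array(json.load(f)["input"], dtype=np.float32)
```

Note that the how-to guide linked in the issue also shows saving inputs with Polygraphy's own JSON helpers, which may be the format `--load-inputs` expects; the snippet above only demonstrates the plain-JSON round trip.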
Description
I'm trying to generate a calibration cache file for post-training quantization using Polygraphy. I created a custom input JSON file following this guide: https://github.com/NVIDIA/TensorRT/blob/main/tools/Polygraphy/how-to/use_custom_input_data.md. The input shape of the model is (1, 3, 384, 640).
The command used and the resulting output are below:

polygraphy convert model.onnx --int8 --load-inputs custom.json --calibration-cache custom_calib.cache -o model_trt.engine

[I] Loading input data from custom.json
[!] Could not decode serialized type: np.ndarray. This could be because a required module is missing.
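To see what actually ended up in custom.json (as the commenter asks), a small hypothetical helper can report the decoded JSON type of each top-level entry; this helper is for illustration only and is not part of Polygraphy:

```python
import json

def top_level_types(path):
    """Report the decoded JSON type of each top-level entry in the file.
    (Hypothetical diagnostic helper, not part of Polygraphy.)"""
    with open(path) as f:
        data = json.load(f)
    return {key: type(value).__name__ for key, value in data.items()}
```

A value reported as "list" means the array was flattened with .tolist(); any other type suggests the array was written by a custom encoder whose output the loader cannot decode back into an np.ndarray.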
Environment
TensorRT Version: 10.0.1.6-1
NVIDIA GPU: Tesla T4
NVIDIA Driver Version: 470.239.06
CUDA Version: 11.4
CUDNN Version:
Operating System: Ubuntu 20.04.6 LTS
Python Version (if applicable): 3.8.10
Tensorflow Version (if applicable):
PyTorch Version (if applicable):
Baremetal or Container (if so, version):
Relevant Files
Model link:
Steps To Reproduce
Commands or scripts: polygraphy convert model.onnx --int8 --load-inputs custom.json --calibration-cache custom_calib.cache -o model_trt.engine
Have you tried the latest release?:
Can this model run on other frameworks? For example, run the ONNX model with ONNXRuntime (polygraphy run <model.onnx> --onnxrt): Yes