Hi,
I trained a GluonTS TFT (Temporal Fusion Transformer) model with the MXNet package. When I use it to make forecasts, it works great. My goal is to export this model to ONNX so that I can import it into MATLAB.
To export to ONNX, I first need to serialize the model, and then I can use the prediction_net-0000.params and prediction_net-symbol.json files to export to ONNX. I'm not sure whether the problem is with the serialization step or with the model export.
After training, I have a predictor that is a gluonts.mx.model.predictor.RepresentableBlockPredictor object. I create the .json and .params files with:
from pathlib import Path
import os
p = predictor.as_symbol_block_predictor(dataset=training_data)
p.serialize_prediction_net(path=Path(os.path.join(save_path, onnx_dir)))
The as_symbol_block_predictor() call gives me this warning. I'm not sure whether it indicates a problem, but serialize_prediction_net() does create the .params and .json files.
[...]\mxnet\gluon\block.py:1512: UserWarning: Cannot decide type for the following arguments. Consider providing them as input:
data0: None
input_sym_arg_type = in_param.infer_type()[0]
Then, to export to ONNX, I call onnx_mxnet.export_model() on the serialized symbol and parameter files.
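Roughly, the export call looks like the following (a sketch assuming the mxnet.contrib.onnx API from MXNet 1.x; the output file name and the input_shape value are placeholders, and input_shape is exactly the part I am unsure about):

import os
import numpy as np
from mxnet.contrib import onnx as onnx_mxnet

sym_file = os.path.join(save_path, onnx_dir, "prediction_net-symbol.json")
params_file = os.path.join(save_path, onnx_dir, "prediction_net-0000.params")

onnx_mxnet.export_model(
    sym=sym_file,
    params=params_file,
    input_shape=[(1, 100)],   # placeholder -- this is the value I don't know how to set
    input_type=np.float32,    # assuming float32 inputs
    onnx_file_path=os.path.join(save_path, onnx_dir, "prediction_net.onnx"),
)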
Running the export raises an error about the axis of the _temporalfusiontransformerpredictionnetwork0_variableselectionnetwork1_gatedresidualnetwork1_layernorm0layernorm0 node found in the prediction_net-symbol.json file. I'm not sure whether the problem is with the serialization or with the ONNX export. As the first line of the error message suggests, there is a problem with the input_shape parameter. The model takes a (100,) time series vector as input, supplied in a PandasDataset, and I have no features. What is the input shape for this type of model? I can't find a similar situation anywhere online. Is it (batch_size, sequence_length, num_features)?
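Would inspecting the serialized symbol be a reasonable way to find the expected input names and count? For example, something like this (just a guess on my part, reusing save_path and onnx_dir from above):

import os
import mxnet as mx

# Load the serialized prediction network and list its inputs,
# hoping the names reveal how many inputs the network expects
sym = mx.sym.load(os.path.join(save_path, onnx_dir, "prediction_net-symbol.json"))
print(sym.list_arguments())  # data inputs plus learned parameters
print(sym.list_inputs())     # all inputs, including auxiliary states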
The parameters.json file created with the p.serialize_prediction_net() function has the following information:
Can we infer the model input shape from this? Thank you for your help!
Environment