Without this change, TRT complains that multiple weights with the same values but different counts are added to the graph (using the dynamo-based ONNX export of PyTorch 2):
[TRT] [E] [network.cpp::setWeightsName::3797] Error Code 1: Internal Error (Error: Weights of same values but of different counts are used in the network!)
It appears that some of the weights go out of scope and their addresses get reused, so distinct weights end up aliasing the same memory. I fixed this by copying the values into tempWeights, whose backing storage does not go out of scope, but I am not sure whether this is the proper fix.