Closed: QHenry1990 closed this issue 1 year ago.
Hi @QHenry1990,
Can you provide the Onnx model to help us reproduce the issue?
Regards, Francis.
OK, model1.zip
Hi @QHenry1990,
I've reproduced the issue and am now looking into it.
Thanks, Francis.
Hi @QHenry1990,
I have created a fix for this issue here: https://review.mlplatform.org/c/ml/armnn/+/9170. Can you please apply this patch and see if it works for you?
Thanks again for letting us know about it. All the best!
Kind regards,
Matthew
thanks a lot!
Hi @QHenry1990,
This fix has now been merged to the main branch so you can get it there. It unfortunately won't make it into the upcoming release (23.02) but it will be in the next one (23.05).
I will close this issue now because it has been fixed. Thanks again for letting us know about it. All the best!
Kind regards,
Matthew
When I load the ONNX model, the output dimensions and element count are correct, as reported by:

    std::cout << "output size = " << outputBinding.second.GetNumElements() << std::endl;

but when I create the runtime with armnn::IRuntime::CreationOptions options; and load the network, it terminates with:

    terminate called after throwing an instance of 'armnn::LayerValidationException'
      what(): FullyConnectedLayer: TensorShape set on OutputSlot[0] does not match the inferred shape. : [1,20,2] != [1,2]

The model is very simple: a single PyTorch linear layer with input shape [1,20,8] and output shape [1,20,2].
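For context on why the reported shapes are the right ones: a fully connected (linear) layer applied to a 3-D input operates on the last dimension and preserves the leading ones, so the expected output shape here really is [1,20,2] rather than the [1,2] that the shape inference produced. A minimal NumPy sketch of that shape arithmetic (the weight and bias values are arbitrary placeholders, not taken from the model in this issue):

```python
import numpy as np

# Input matching the model in this issue: batch 1, sequence length 20, 8 features.
x = np.zeros((1, 20, 8))

# A linear layer mapping 8 features -> 2, analogous to torch.nn.Linear(8, 2).
# Only the shapes matter for this illustration, so zeros are used throughout.
weight = np.zeros((8, 2))
bias = np.zeros(2)

# matmul contracts the last axis of x with the first axis of weight and
# broadcasts over the leading dimensions: [1,20,8] @ [8,2] -> [1,20,2].
y = x @ weight + bias

print(y.shape)  # (1, 20, 2)
```

So a shape inference that collapses the middle dimension to produce [1,2] treats the layer as if the input were 2-D, which is what the fix linked above addresses.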