ARM-software / armnn

Arm NN ML Software. The code here is a read-only mirror of https://review.mlplatform.org/admin/repos/ml/armnn
https://developer.arm.com/products/processors/machine-learning/arm-nn
MIT License

FullyConnectedLayer: TensorShape set on OutputSlot[0] does not match the inferred shape. : [1,20,2] != [1,2] #714

Closed. QHenry1990 closed this issue 1 year ago.

QHenry1990 commented 1 year ago

When I use the ONNX model through the parser, the output dimensions and number of elements are correct (for example, std::cout<<"output size = "<<outputBinding.second.GetNumElements()<<std::endl; prints the expected value). But when I create the runtime with armnn::IRuntime::CreationOptions options; and load the network, I get this error:

terminate called after throwing an instance of 'armnn::LayerValidationException'
what(): FullyConnectedLayer: TensorShape set on OutputSlot[0] does not match the inferred shape. : [1,20,2] != [1,2]

The model is very simple, just a single PyTorch linear layer; the input shape is [1,20,8] and the output shape is [1,20,2].
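For context, this is a minimal sketch of the load path I am using. The model file name and the output binding name "output" are placeholders for my actual values, and the backend is assumed to be CpuRef here; the exception is raised while the network is optimised and loaded into the runtime.

```cpp
// Minimal reproduction sketch (assumed file name "model1.onnx" and binding name "output").
#include <armnn/ArmNN.hpp>
#include <armnnOnnxParser/IOnnxParser.hpp>
#include <iostream>
#include <vector>

int main()
{
    // Parse the ONNX file into an Arm NN network.
    auto parser = armnnOnnxParser::IOnnxParser::Create();
    armnn::INetworkPtr network = parser->CreateNetworkFromBinaryFile("model1.onnx");

    // Querying the output binding from the parser reports the expected shape/elements.
    auto outputBinding = parser->GetNetworkOutputBindingInfo("output");
    std::cout << "output size = " << outputBinding.second.GetNumElements() << std::endl;

    // Creating the runtime and optimising/loading the network is where the
    // armnn::LayerValidationException is thrown.
    armnn::IRuntime::CreationOptions options;
    armnn::IRuntimePtr runtime = armnn::IRuntime::Create(options);

    std::vector<armnn::BackendId> backends = { armnn::Compute::CpuRef };
    armnn::IOptimizedNetworkPtr optNet =
        armnn::Optimize(*network, backends, runtime->GetDeviceSpec());

    armnn::NetworkId networkId;
    runtime->LoadNetwork(networkId, std::move(optNet));
    return 0;
}
```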

FrancisMurtagh-arm commented 1 year ago

Hi @QHenry1990,

Can you provide the Onnx model to help us reproduce the issue?

Regards, Francis.

QHenry1990 commented 1 year ago

> Hi @QHenry1990,
>
> Can you provide the Onnx model to help us reproduce the issue?
>
> Regards, Francis.

OK, model1.zip

FrancisMurtagh-arm commented 1 year ago

Hi @QHenry1990,

I've reproduced the issue and am now looking into it.

Thanks, Francis.

matthewsloyanARM commented 1 year ago

Hi @QHenry1990,

I have created a fix for this issue here: https://review.mlplatform.org/c/ml/armnn/+/9170. Can you please apply this patch and see if it works for you?

Thanks again for letting us know about it. All the best!

Kind regards,

Matthew

QHenry1990 commented 1 year ago

Thanks a lot!

matthewsloyanARM commented 1 year ago

Hi @QHenry1990,

This fix has now been merged to the main branch, so you can get it from there. Unfortunately, it won't make it into the upcoming release (23.02), but it will be in the next one (23.05).

I will close this issue now because it has been fixed. Thanks again for letting us know about it. All the best!

Kind regards,

Matthew