Closed kylesayrs closed 1 year ago
Hi @kylesayrs ,
Could you try the following option?
* Option key: `"allow-expanded-dims"`
* Possible values: `"true"` / `"false"`
* Description: If true, dimensions with a size of 1 are disregarded when validating tensor shapes, but tensor sizes must still match.
* This is an experimental parameter that is incompatible with `"infer-output-shape"`, and it may be removed in a later update.
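For reference, here is a minimal sketch of how such a delegate option could be passed from Python via the TFLite runtime. The library path (`libarmnnDelegate.so`), backend choice (`CpuAcc`), and model filename are assumptions; adjust them for your setup:

```python
# Hypothetical sketch: passing "allow-expanded-dims" to the ArmNN TFLite delegate.
# The options are plain string key/value pairs forwarded to the delegate.
delegate_options = {
    "backends": "CpuAcc",            # assumed backend; could also be "GpuAcc"
    "allow-expanded-dims": "true",   # ignore size-1 dims during shape validation
}

# Uncomment on a machine with the ArmNN delegate installed:
# import tflite_runtime.interpreter as tflite
# armnn_delegate = tflite.load_delegate("libarmnnDelegate.so",
#                                       options=delegate_options)
# interpreter = tflite.Interpreter(model_path="bert.tflite",
#                                  experimental_delegates=[armnn_delegate])
# interpreter.allocate_tensors()
```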
Hi @kylesayrs, did you try @ArmRyan's recommendation? Please feedback and let me know if you need further assistance. I will close this issue in the coming days otherwise. Thank you!
Hi @kylesayrs
I will close this on October 10th if I do not hear from you, please let me know if you need any more help.
Thank you, Keith
Hi there. I'm trying to run inference with a BERT model whose batch size is `1`, and I run into a shape inference error. I have no problem running the model with XNNPACK. This error does not occur when the model is generated with a batch size of `None`, but that leads to another error mentioned here. I know that this issue mentions that layer norm nodes are not supported by ArmNN, however I don't believe this model graph has any nodes that are not supported (or at least this error doesn't make that evident).
Download model (this model is quantized, whereas the script's model is non-quantized; both give the same error): https://drive.google.com/file/d/1kF2nynDj9exSU9Ensg0fsp2NaFzD8s2-/view?usp=sharing
**Create inferencer**

**Shape inference error**