david-macleod opened this issue 2 years ago
The op is missing such a check and logic to exit early: https://github.com/microsoft/onnxruntime/blob/a367f0664d831d4fb2557e8d63fc09d67d7386fe/onnxruntime/core/providers/cuda/nn/conv.cc#L180.
Are you encountering this in a real-world model, and would you like to contribute this fix?
This issue has been automatically marked as stale due to inactivity and will be closed in 7 days if no further activity occurs. If further support is needed, please provide an update and/or more details.
**Describe the bug**
When passing tensors with a zero-sized dimension, e.g. (8, 1024, 0), to BatchNorm1d, we hit the following error:
This is not an issue for the CPU EP, and zero-sized inputs should be supported according to the ONNX spec.
Thank you
**System information**
**To Reproduce**
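A minimal sketch of the kind of repro described above (not the original script; the model, export settings, and file name are illustrative), assuming a PyTorch BatchNorm1d module exported to ONNX and run with onnxruntime on the CUDA EP:

```python
import numpy as np
import torch
import onnxruntime as ort

# Illustrative model: a single BatchNorm1d layer over 1024 channels
model = torch.nn.BatchNorm1d(1024).eval()

# Export to ONNX with a dynamic final dimension so a zero-sized length is legal
dummy = torch.randn(8, 1024, 4)
torch.onnx.export(
    model, dummy, "bn1d.onnx",
    input_names=["x"], output_names=["y"],
    dynamic_axes={"x": {0: "batch", 2: "length"}, "y": {0: "batch", 2: "length"}},
)

# Input with a zero-sized dimension, e.g. (8, 1024, 0)
x = np.zeros((8, 1024, 0), dtype=np.float32)

# CPU EP: succeeds and returns an empty (8, 1024, 0) output
cpu_sess = ort.InferenceSession("bn1d.onnx", providers=["CPUExecutionProvider"])
print(cpu_sess.run(None, {"x": x})[0].shape)

# CUDA EP: hits the error described above
cuda_sess = ort.InferenceSession("bn1d.onnx", providers=["CUDAExecutionProvider"])
print(cuda_sess.run(None, {"x": x})[0].shape)
```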
**Expected behavior**
A successful inference pass, as demonstrated with the CPU EP.