ONNX MatMul relies on numpy semantics.

https://github.com/alecerio/NeuralCasting/blob/main/neural_cast/frontend/parser/ops/matmul.py

def infer_output_shape(self) -> list[list[int]]

does not implement the 1-D promotion rule described at https://numpy.org/doc/stable/reference/generated/numpy.matmul.html:

"If the first argument is 1-D, it is promoted to a matrix by prepending a 1 to its dimensions. After matrix multiplication the prepended 1 is removed. If the second argument is 1-D, it is promoted to a matrix by appending a 1 to its dimensions. After matrix multiplication the appended 1 is removed."
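For reference, the promotion rule can be checked directly with numpy (a standalone snippet, independent of NeuralCasting):

```python
import numpy as np

# First argument 1-D: promoted to (1, 3), then the prepended 1 is removed.
x = np.ones(3)            # shape (3,)
W = np.ones((3, 4))       # shape (3, 4)
print(np.matmul(x, W).shape)   # (4,), not (1, 4)

# Second argument 1-D: promoted to (3, 1), then the appended 1 is removed.
A = np.ones((2, 3))       # shape (2, 3)
v = np.ones(3)            # shape (3,)
print(np.matmul(A, v).shape)   # (2,)
```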
Because this rule is missing from the shape inference, compilation results in

neural_cast.frontend.exceptions.CompilerException.CompilerException: Error: incompatible input broadcasting in Add operator. Shape 2 does not fit in shape 1 for broadcasting.

when an Add Node comes after a MatMul Node, which occurs in a typical Linear layer.
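For illustration only, a minimal sketch of the missing promotion logic, written as a standalone helper over plain shape lists (the function name and signature are hypothetical, not the actual infer_output_shape, and batch broadcasting is omitted for brevity):

```python
def matmul_output_shape(shape_a: list[int], shape_b: list[int]) -> list[int]:
    """Output shape of a numpy-style matmul for 1-D and 2-D operands."""
    a_was_1d = len(shape_a) == 1
    b_was_1d = len(shape_b) == 1
    if a_was_1d:
        shape_a = [1] + shape_a        # prepend a 1, as numpy.matmul does
    if b_was_1d:
        shape_b = shape_b + [1]        # append a 1, as numpy.matmul does

    if shape_a[-1] != shape_b[-2]:
        raise ValueError(f"inner dimensions do not match: {shape_a} x {shape_b}")

    out = shape_a[:-1] + shape_b[-1:]  # (m, k) x (k, n) -> (m, n)

    if a_was_1d:
        out = out[:-2] + out[-1:]      # drop the prepended 1
    if b_was_1d:
        out = out[:-1]                 # drop the appended 1
    return out

# matmul_output_shape([3], [3, 4]) -> [4], matching np.matmul above,
# so a following Add with a bias of shape [4] would broadcast correctly.
```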