jayasuryamaganuru opened 2 months ago
@umadevimcw, is this something your team can look at, since it is an eltwise op? I am not sure.
@jayasuryamaganuru, which model does this block?
@jliangTT Functional Whisper. The model currently supports only batch_size=1; we hit this issue while extending it to batch_size > 1.
The fix will require both input tensors to have the same rank for broadcasting, e.g. reshaping the 2D input to (1, h, w):
```python
import torch

h, w = 32, 32  # example sizes

torch_input_tensor_a = torch.rand((8, h, w), dtype=torch.bfloat16)
torch_input_tensor_b = torch.rand((1, h, w), dtype=torch.bfloat16)
```
**Describe the bug**
The TTNN `add` op does not broadcast as expected. The intention is to broadcast a 2D tensor onto a 3D tensor, but the op fails with an unsupported-broadcast error.
**To Reproduce**
Steps to reproduce the behavior:
1. Save the snippet below to a file.
2. Run `pytest path/to/file`.
**Expected behavior**
`input_tensor_b` should be added to `input_tensor_a` on all 8 channels, producing an `output_tensor` of shape (8, h, w).
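For comparison, plain PyTorch broadcasts these shapes without issue; a minimal sketch of the expected semantics (the concrete values of `h` and `w` are chosen here for illustration):

```python
import torch

h, w = 32, 32  # illustrative sizes

# Shapes from the report: a 3D tensor and a broadcastable tensor
# with a size-1 leading dimension.
torch_input_tensor_a = torch.rand((8, h, w), dtype=torch.bfloat16)
torch_input_tensor_b = torch.rand((1, h, w), dtype=torch.bfloat16)

# PyTorch broadcasts the size-1 dim across all 8 channels.
torch_output_tensor = torch_input_tensor_a + torch_input_tensor_b
print(torch_output_tensor.shape)  # torch.Size([8, 32, 32])
```

This is the behavior the TTNN `add` op is expected to match for these shapes.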