When executing forward_pass of a multi-output model with a loss, the loss calculation fails because of a call to squeeze_or_expand_to_same_rank inside LossWrapper: the arguments passed in are a tuple and a list rather than single tensors, so the function fails as soon as it accesses .shape on them:
https://github.com/keras-team/keras/blob/bce176f7a239b32ad321e8d7a019588ee4217baa/keras/src/losses/loss.py#L109
Loss's implementation first calls tf.convert_to_tensor before the squeeze call, while LossWrapper omits this step. Since the wrapper is executed first, losses can no longer handle multiple outputs:
https://github.com/keras-team/keras/blob/bce176f7a239b32ad321e8d7a019588ee4217baa/keras/src/losses/losses.py#L1291
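To illustrate the failure mode, here is a minimal, simplified sketch (not the actual Keras implementation) of a rank-matching helper in the spirit of squeeze_or_expand_to_same_rank, using NumPy arrays in place of tensors. It reads `.shape` directly on its arguments, so a list or tuple of outputs raises AttributeError, while converting each output to an array first, as the Loss base class does, works fine:

```python
import numpy as np

def squeeze_or_expand_to_same_rank(x1, x2):
    """Simplified sketch: reads `.shape` immediately, so plain
    lists/tuples (multi-output structures) raise AttributeError."""
    x1_rank = len(x1.shape)  # AttributeError if x1 is a list/tuple
    x2_rank = len(x2.shape)
    if x1_rank == x2_rank + 1 and x1.shape[-1] == 1:
        x1 = np.squeeze(x1, axis=-1)
    elif x2_rank == x1_rank + 1 and x2.shape[-1] == 1:
        x2 = np.squeeze(x2, axis=-1)
    return x1, x2

# Multi-output structures fail before any shape logic runs:
try:
    squeeze_or_expand_to_same_rank([np.ones((2, 1))], [np.ones((2, 1))])
except AttributeError as e:
    print("fails:", e)

# Converting each output to a tensor first (as Loss does) works:
y_true, y_pred = [np.ones((2, 1))], [np.ones((2,))]
a, b = squeeze_or_expand_to_same_rank(np.asarray(y_true[0]),
                                      np.asarray(y_pred[0]))
print(a.shape, b.shape)  # (2,) (2,)
```

This is why the order matters: when the wrapper runs the rank-matching step before the per-output tensor conversion, the structured (list/tuple) inputs never get flattened into tensors it can inspect.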
There are two straightforward ways to fix this problem: