
Loss calculation fails with multiple outputs due to the LossWrapper #20373

Open markomitos opened 1 week ago

markomitos commented 1 week ago

When trying to execute the forward_pass of a multi-output model with a loss, the loss calculation fails because of a call to squeeze_or_expand_to_same_rank inside the LossWrapper: the parameters passed in are a tuple and a list rather than individual tensors. The function fails when it reads .shape on the passed parameters:

https://github.com/keras-team/keras/blob/bce176f7a239b32ad321e8d7a019588ee4217baa/keras/src/losses/loss.py#L109

The Loss implementation first calls tf.convert_to_tensor before the squeeze call, while the LossWrapper omits this step. Because the wrapper runs first, losses can no longer handle multiple outputs.

https://github.com/keras-team/keras/blob/bce176f7a239b32ad321e8d7a019588ee4217baa/keras/src/losses/losses.py#L1291
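As a rough illustration of the failure mode (not the original traceback; the internal import path is assumed from the file linked above), calling the helper directly with nested structures instead of tensors fails as soon as it reads `.shape`:

```python
# Hypothetical sketch of the failure mode, assuming the helper lives at the
# internal path linked above (keras/src/losses/loss.py).
from keras import ops
from keras.src.losses.loss import squeeze_or_expand_to_same_rank

# A multi-output model yields a tuple/list of per-output tensors.
y_true = (ops.ones((8, 1)), ops.ones((8, 1)))
y_pred = [ops.zeros((8, 1)), ops.zeros((8, 1))]

try:
    # The helper expects single tensors and reads .shape on each argument,
    # so a plain tuple/list raises immediately.
    squeeze_or_expand_to_same_rank(y_true, y_pred)
except AttributeError as e:
    print("Fails because a tuple/list has no .shape:", e)
```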

There are two possible fixes, neither of which should be too disruptive:

nicolaspi commented 1 day ago

This is potentially fixed by https://github.com/keras-team/keras/pull/20358, as the loss wrapper now maps squeeze_or_expand_to_same_rank over the structure.
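For context, a minimal sketch of what mapping the rank alignment over the structure could look like, using keras.tree.map_structure; this is an illustration, not the actual code in the PR, and the import path is an assumption:

```python
# Illustrative only: apply the rank-alignment helper per output instead of
# on the whole tuple/list at once. Names and import path are assumptions.
from keras import tree
from keras.src.losses.loss import squeeze_or_expand_to_same_rank

def align_ranks(y_true_struct, y_pred_struct):
    # map_structure walks the matching nested structures and calls the helper
    # on each pair of leaf tensors; each leaf of the result is a
    # (y_true, y_pred) pair with compatible ranks.
    return tree.map_structure(
        squeeze_or_expand_to_same_rank, y_true_struct, y_pred_struct
    )
```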