The test function was reporting a loss that was too low: it weighted each source differently depending on its position in `model_config['source_names']`.
For example, in the case of two sources (voice and accompaniment), the loss should be `(voice_loss + accomp_loss) / 2`, but instead it was `voice_loss / 4 + accomp_loss / 2`.
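A minimal sketch of how such a skewed weighting can arise, assuming the test loss was built up as a running average that halves the accumulated total at each source (the variable names here are illustrative, not from the actual code):

```python
# Illustrative per-source losses (equal values make the skew easy to see).
source_losses = {"voice": 1.0, "accompaniment": 1.0}

# Buggy pattern: fold each source into a running total and halve after
# every step. Earlier sources get halved again on every later step, so
# the result is voice_loss / 4 + accomp_loss / 2 instead of the mean.
buggy = 0.0
for loss in source_losses.values():
    buggy = (buggy + loss) / 2

# Fix: sum all per-source losses first, then divide once by the count.
fixed = sum(source_losses.values()) / len(source_losses)

print(buggy)  # 0.75 -> too low, and order-dependent
print(fixed)  # 1.0  -> correct mean
```

With equal per-source losses the buggy version under-reports, and with unequal losses the reported value also changes if the source order changes, which the correct mean does not.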
Good catch! Thanks!
Fortunately this doesn't affect training itself, so the models produced after training shouldn't change significantly once this fix is introduced.
One tab character too many. :)