muslll / neosr

neosr is a framework for training real-world single-image super-resolution networks.
https://github.com/muslll/neosr
Apache License 2.0

Accumulate losses #47

terrainer closed this issue 6 months ago

terrainer commented 6 months ago

The generator losses can be accumulated into l_g_total and a single scaled backward() call run on the sum, as was done until recently. Lower-level gradients do not get overwritten — gradients from separate backward passes accumulate — so the scaler only needs to be applied to each loss individually if they are not summed first.
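The equivalence terrainer describes follows from the linearity of differentiation: the gradient of a sum of losses equals the sum of the per-loss gradients, so one backward pass on l_g_total matches accumulating gradients from separate backward passes. A minimal numeric sketch in plain Python (no framework; the loss functions here are hypothetical stand-ins, not neosr's actual losses):

```python
# Why summing losses before a single backward pass is valid:
# d(l1 + l2)/dw == dl1/dw + dl2/dw, so gradients accumulated from
# separate backward calls equal the gradient of the summed loss.

def grad_l1(w):
    # l1(w) = (w - 1)**2  ->  dl1/dw = 2*(w - 1)
    return 2 * (w - 1)

def grad_l2(w):
    # l2(w) = 3*w  ->  dl2/dw = 3
    return 3.0

def grad_total(w):
    # l_total(w) = l1(w) + l2(w)  ->  derivative is the sum of the parts
    return 2 * (w - 1) + 3.0

w = 0.5
separate = grad_l1(w) + grad_l2(w)  # two backward passes, grads accumulate
summed = grad_total(w)              # one backward pass on the summed loss
assert separate == summed
```

In the PyTorch AMP setting the issue refers to, this is why calling the gradient scaler once on the accumulated l_g_total is sufficient, rather than scaling and backpropagating each generator loss individually.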

muslll commented 6 months ago

Solved in 6213a72. Thanks for your comments, they helped a lot.