As we discussed, and to avoid any confusion, this PR merges dense and mixing layers into a more general sum layer, as formalized in https://arxiv.org/abs/2409.07953.
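A minimal sketch of the idea (all names here are hypothetical, not the library's actual API): a single sum layer computes a weighted sum over the concatenated units of its inputs, so a dense layer is just the arity-1 case and a mixing layer is the arity-n case with one weight per input layer.

```python
from typing import List

def sum_layer(inputs: List[List[float]], weights: List[List[float]]) -> List[float]:
    """General sum layer: outputs[j] = sum_i weights[j][i] * concat(inputs)[i].

    - Dense layer  = the arity-1 special case (a single input vector).
    - Mixing layer = the arity-n special case where the weight row for
      output unit k is nonzero only at unit k of each input.
    """
    flat = [x for vec in inputs for x in vec]  # concatenate all input units
    return [sum(w * x for w, x in zip(row, flat)) for row in weights]

# Dense case: one input of 2 units, 1 output unit.
dense_out = sum_layer([[1.0, 2.0]], [[0.5, 0.5]])
# Mixing case: two inputs of 2 units each, weights tie unit k across inputs.
mix_out = sum_layer([[1.0, 2.0], [3.0, 4.0]],
                    [[0.5, 0.0, 0.5, 0.0],
                     [0.0, 0.5, 0.0, 0.5]])
```

This is only an illustration of why the two layer types collapse into one parameterization; the real implementation operates on batched tensors in log-space.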
Moreover, it adds sanity checks on the symbolic representation (symbolic circuit, parameterizations, and initializations), preventing some bugs (e.g. shape mismatches) from surfacing only after compilation and optimizations.
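As a rough sketch of the kind of check meant here (the data layout and function name are hypothetical, not the actual symbolic-circuit API): validate that each symbolic layer's declared number of input units matches its children's output units, so a shape mismatch fails at construction time rather than after compilation.

```python
def check_shapes(layers):
    """Sanity-check a symbolic graph before compilation.

    `layers` maps a layer name to a tuple
    (children, num_input_units, num_output_units).
    Raises ValueError on the first input/output shape mismatch.
    """
    for name, (children, num_input_units, _) in layers.items():
        for child in children:
            child_out = layers[child][2]
            if child_out != num_input_units:
                raise ValueError(
                    f"{name}: expects {num_input_units} input units, "
                    f"but child {child!r} outputs {child_out}"
                )

# A consistent graph passes silently; an inconsistent one raises early.
check_shapes({"leaf": ([], 0, 4), "sum": (["leaf"], 4, 2)})
```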
Finally, since I was already changing the torch implementation of the sum layer, this PR also fixes the bug causing sum layers to output NaNs when the inputs to a sum unit are all ±inf.
Closes #309, #316, #319
Detailed changes:
- #309:
  - Replaced the symbolic mixing and dense layers with a symbolic sum layer.
  - Added a `use_mixing_weights` boolean flag in the `image_data` template (see here), so as to enable a parameterization of sum layers with arity > 1 that matches the one we had with mixing layers. It defaults to `True` for compatibility with previous results.
  - Added a `nary_sum_weight_factory` argument in the circuit-from-region-graph method, so as to optionally enable specifying a different weight factory for sum layers with arity > 1.
- #316: Added a number of sanity checks to the symbolic representation, such as on the graph structures and on the input/output shapes.
- #319: Fixed the bug causing sum layers to output NaN when the entries being log-sum-exp'd are all infs. This is done by clamping infinite values to be finite (based on the chosen precision).
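For the #319 fix, the failure mode is that a naive log-sum-exp computes `v - m` with `v = m = -inf`, yielding `inf - inf = NaN`. A minimal sketch of the clamping idea (the function name and the clamp threshold are illustrative, not the values used in the actual implementation):

```python
import math

def safe_logsumexp(values, max_log=1e30):
    """Log-sum-exp that stays finite when all entries are +/-inf.

    Infinities are clamped to a large finite magnitude (chosen per the
    working precision) before the usual max-shift trick, so the
    subtraction v - m can never produce inf - inf = NaN.
    """
    clamped = [min(max(v, -max_log), max_log) for v in values]
    m = max(clamped)
    return m + math.log(sum(math.exp(v - m) for v in clamped))

# All inputs -inf: the naive version returns NaN, this returns a finite value.
result = safe_logsumexp([float("-inf"), float("-inf")])
```

With all entries equal to `-inf`, the clamped values are all `-1e30`, the shift makes every exponent zero, and the result is about `-1e30 + log(n)`: finite, and effectively "log of zero probability" at that precision.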