slimgroup / InvertibleNetworks.jl

A Julia framework for invertible neural networks
MIT License

LearnableSqueezer #94

Open grizzuti opened 9 months ago

grizzuti commented 9 months ago

Added a "learnable squeezer" layer, typically used in invertible U-Nets (see Etmann et al., 2020, https://arxiv.org/abs/2005.05220).

A few other very minor changes, the most important of which is removing the type `InvertibleLayer` in favor of `InvertibleNetwork`. I didn't really see a need for a separate `InvertibleLayer` type.
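For context on what a learnable squeezer does, here is a minimal sketch (not this PR's actual implementation) of the core idea from Etmann et al., 2020: a squeezing step trades spatial resolution for channels, and making it learnable while keeping exact invertibility can be done by mixing each squeezed block with an orthogonal matrix, parameterized as the exponential of a skew-symmetric matrix. All names below are illustrative.

```julia
using LinearAlgebra

# Free parameter: any real matrix A. The layer's weight is the
# orthogonal matrix exp(A - A'), since the exponential of a
# skew-symmetric matrix is orthogonal.
A = randn(4, 4)        # 4 = s^2 channels per block for stride s = 2
S = A - A'             # skew-symmetric part
Q = exp(S)             # orthogonal: Q' * Q ≈ I

# Forward: flatten one 2x2 spatial block into a length-4 channel
# vector, then mix it with Q.
x = randn(4)
y = Q * x              # squeezed representation

# Inverse is just the transpose, so no linear solve is needed.
x_rec = Q' * y
@assert isapprox(x, x_rec; atol=1e-10)
```

The appeal of this parameterization is that gradients flow through `A` unconstrained, while the resulting transform stays exactly orthogonal, so the inverse pass is cheap and numerically stable.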

codecov[bot] commented 9 months ago

Codecov Report

Attention: 24 lines in your changes are missing coverage. Please review.

| Files | Coverage Δ |
| --- | --- |
| src/InvertibleNetworks.jl | `60.00% <ø> (ø)` |
| src/conditional_layers/conditional_layer_glow.jl | `88.63% <ø> (ø)` |
| src/conditional_layers/conditional_layer_hint.jl | `99.18% <ø> (ø)` |
| ...itional_layers/conditional_layer_residual_block.jl | `100.00% <ø> (ø)` |
| src/layers/invertible_layer_actnorm.jl | `90.90% <ø> (ø)` |
| src/layers/invertible_layer_basic.jl | `93.33% <ø> (ø)` |
| src/layers/invertible_layer_conv1x1.jl | `89.47% <100.00%> (ø)` |
| src/layers/invertible_layer_glow.jl | `96.34% <ø> (ø)` |
| src/layers/invertible_layer_hyperbolic.jl | `86.20% <ø> (ø)` |
| src/layers/invertible_layer_irim.jl | `98.11% <ø> (ø)` |
| ... and 12 more | |


rafaelorozco commented 3 months ago

@grizzuti in my experience the random seed for the gradient test can sometimes be finicky, so I have set the tests to rerun a few times before calling it a failure. I think this is particularly relevant when the test fails on only a single Julia version. I will try this on this branch, and if the Julia 1.6 job passes, we should merge.
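The rerun idea described above can be sketched as follows; this is an illustrative pattern, not the package's actual test harness. A randomized check is attempted up to `ntries` times with fresh random draws, and it only counts as a failure if every attempt fails.

```julia
using Test

# Retry a stochastic check a few times before declaring failure.
# `check` is any zero-argument function returning true/false;
# the name and signature here are hypothetical.
function passes_with_retries(check::Function; ntries::Int=3)
    for _ in 1:ntries
        check() && return true   # any single success is enough
    end
    return false                 # all attempts failed
end

# Example: a noisy check, standing in for a finite-difference
# gradient test whose tolerance is occasionally missed.
noisy_check() = abs(1e-7 * randn()) < 1e-6
@test passes_with_retries(noisy_check)
```

The trade-off is that retries mask genuinely marginal tolerances, so they are best reserved for tests whose failures are demonstrably seed-dependent rather than systematic.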