Closed: rafaelorozco closed this pull request 1 year ago.
Merging #59 (25e999f) into master (a068423) will increase coverage by 0.18%. The diff coverage is 90.22%.
@@            Coverage Diff             @@
##           master      #59      +/-   ##
==========================================
+ Coverage   87.88%   88.06%   +0.18%
==========================================
  Files          31       33       +2
  Lines        2327     2439     +112
==========================================
+ Hits         2045     2148     +103
- Misses        282      291       +9
Impacted Files | Coverage Δ | |
---|---|---|
src/InvertibleNetworks.jl | 60.00% <ø> (ø) | |
...rc/networks/invertible_network_conditional_glow.jl | 82.08% <82.08%> (ø) | |
src/conditional_layers/conditional_layer_glow.jl | 97.72% <97.72%> (ø) | |
src/layers/layer_residual_block.jl | 98.70% <100.00%> (+0.01%) | :arrow_up: |
src/utils/activation_functions.jl | 89.23% <0.00%> (+6.15%) | :arrow_up: |
Simple conditional network based on the Glow coupling layer and the cIIN idea of concatenating the condition tensor to the input of the residual block.
Includes an important change to the residual block that lets the user set the size of the RB's output with the keyword argument n_out (see the sketch below).
This change should be backwards compatible.
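For illustration, a minimal sketch of the new keyword. The two positional arguments follow the existing ResidualBlock constructor; the n_out value and the forward call are assumptions based on the description above, not the exact diff:

```julia
using InvertibleNetworks

# Residual block whose output channel count is set explicitly via n_out.
# Omitting n_out keeps the previous default, so existing code is unaffected.
RB = ResidualBlock(16, 32; n_out=8)  # n_in=16, n_hidden=32, 8 output channels

X = randn(Float32, 28, 28, 16, 4)    # WHCN tensor: 28x28, 16 channels, batch of 4
Y = RB.forward(X)                    # size(Y, 3) == 8
```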
I want to use this network to showcase non-conditional and conditional MNIST generation. This network doesn't necessarily perform better than conditional HINT (same performance as far as my tests go) but is much more lightweight.
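A hedged usage sketch of the new network: the name is inferred from the file invertible_network_conditional_glow.jl added in this diff, and the argument order and return values of forward are assumptions.

```julia
using InvertibleNetworks

# Conditional Glow: the condition C is concatenated (tensor cat) to the
# input of each coupling layer's residual block, following the cIIN idea.
n_in, n_cond, n_hidden = 1, 1, 32
L, K = 2, 4  # multiscale levels and flow steps per level (assumed meaning)

G = NetworkConditionalGlow(n_in, n_cond, n_hidden, L, K)

X = randn(Float32, 28, 28, n_in, 4)    # e.g. a batch of MNIST digits (WHCN)
C = randn(Float32, 28, 28, n_cond, 4)  # conditioning input, same spatial size

ZX, ZC, logdet = G.forward(X, C)       # latent codes and log-determinant
```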