slimgroup / InvertibleNetworks.jl

A Julia framework for invertible neural networks
MIT License

Summ net #82

Closed rafaelorozco closed 1 year ago

rafaelorozco commented 1 year ago

First try at adding a summary network to conditional glow.

I could use some extra eyes on how to implement this cleanly. Although it gets the job done and works with arbitrary Flux layers, it feels a little clunky.

codecov[bot] commented 1 year ago

Codecov Report

Patch coverage: 96.00% and project coverage change: +0.05 :tada:

Comparison is base (671e9c1) 88.47% compared to head (7fb07d9) 88.52%.

Additional details and impacted files

```diff
@@            Coverage Diff             @@
##           master      #82      +/-   ##
==========================================
+ Coverage   88.47%   88.52%    +0.05%
==========================================
  Files          33       34        +1
  Lines        2577     2589       +12
==========================================
+ Hits         2280     2292       +12
  Misses        297      297
```

| Impacted Files | Coverage Δ | |
|---|---|---|
| src/InvertibleNetworks.jl | `60.00% <ø> (ø)` | |
| src/layers/layer_resnet.jl | `93.75% <80.00%> (+0.41%)` | :arrow_up: |
| src/networks/invertible_network_conditional_glow.jl | `96.05% <100.00%> (ø)` | |
| src/networks/summarized_net.jl | `100.00% <100.00%> (ø)` | |
| src/utils/neuralnet.jl | `75.47% <100.00%> (+0.96%)` | :arrow_up: |


rafaelorozco commented 1 year ago

How about something like this previous commit, @mloubout? You make the summarized net like this:

```julia
# Summary network; make sure it doesn't have any weird normalizations
sum_net = ResNet(n_cond, 16, 3; norm=nothing)

# Network and input
flow = NetworkConditionalGlow(n_in, n_cond, n_hidden, L, K; ndims=length(N))
G = SummarizedNet(flow, sum_net)
```

And the main operations are the same as before:

```julia
Y, ZCond = G.forward(X, Cond)
X_ = G.inverse(Y, ZCond)
G.backward(Y, Y, ZCond; Y_save=Cond)
```
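For intuition, the wrapper pattern above can be sketched in plain Julia. This is an illustrative toy, not the actual `SummarizedNet` implementation from this PR: the struct name, closures, and the mean-pooling "summary network" are all stand-ins. The key idea it demonstrates is that the condition is summarized once on the forward pass, and the returned summarized condition is reused on the way back, so the summary network does not need to be re-run for the inverse.

```julia
# Illustrative sketch (not the PR's code): a minimal "summarized network"
# wrapper that summarizes the condition before handing it to a conditional flow.

struct ToySummarizedNet{FF,FI,S}
    flow_fwd::FF   # conditional flow forward: (x, cond) -> z
    flow_inv::FI   # conditional flow inverse: (z, cond) -> x
    sum_net::S     # summary network: raw condition -> summarized condition
end

function forward(net::ToySummarizedNet, x, cond)
    zcond = net.sum_net(cond)            # summarize once on the forward pass
    return net.flow_fwd(x, zcond), zcond # return z and the summarized condition
end

# The summarized condition returned by forward is reused here directly.
inverse(net::ToySummarizedNet, z, zcond) = net.flow_inv(z, zcond)

# Toy stand-ins: an additive "flow" and a mean-pooling "summary network".
net = ToySummarizedNet(
    (x, c) -> x .+ c,
    (z, c) -> z .- c,
    c -> fill(sum(c) / length(c), size(c)),
)

x    = [1.0, 2.0, 3.0]
cond = [0.5, 1.0, 1.5]
z, zcond = forward(net, x, cond)
x_ = inverse(net, z, zcond)
# x_ reconstructs x up to floating-point error
```

Because the additive toy flow is exactly invertible, the round trip recovers `x`; the real network has the same invertibility property, which is why `G.inverse` only needs `ZCond` and not the raw `Cond`.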