slimgroup / InvertibleNetworks.jl

A Julia framework for invertible neural networks
MIT License

Dense glow network #77

Closed rafaelorozco closed 1 year ago

rafaelorozco commented 1 year ago

Dense Glow network. Working for 1-dim inputs; the channel dimension doesn't need to be divisible by 2. Inputs are x = (nx, nc, batch); nc can be anything, but if nc = 1 then split_scales needs to be turned on to get extra channels for the affine coupling layer.
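As a rough illustration of the shape convention above (the constructor options may differ from what this PR actually merges, so treat this as a sketch using the usual `NetworkGlow(n_in, n_hidden, L, K; ...)` arguments):

```julia
using InvertibleNetworks, Random

# 1D input layout from this PR: (nx, nc, batch) instead of the
# 4D image layout (nx, ny, nc, batch).
nx, nc, batch = 16, 1, 4
X = randn(Float32, nx, nc, batch)

# With nc = 1 there is only a single channel, so split_scales must be
# on: the squeeze steps trade spatial size for extra channels, giving
# the affine coupling layer two halves to operate on.
G = NetworkGlow(nc, 32, 2, 3; split_scales=true)
```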

@mloubout from what I understand, we need to feed in the nx dimension to define a dense layer, right? I guess that was the magic of conv nets: they only needed the channel dimension.
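The point above can be made concrete with a parameter count (names here are illustrative, not the PR's API): a convolution's weights depend only on the kernel size and channel counts, while a dense layer acting on the flattened input depends on nx itself, so nx must be known at construction time.

```julia
# Parameter counts for a 1D conv layer vs. a dense layer on the
# flattened input. The conv count is independent of nx; the dense
# count grows quadratically with nx.
k, c_in, c_out, nx = 3, 4, 8, 64

conv_params  = k * c_in * c_out            # 3 * 4 * 8
dense_params = (nx * c_in) * (nx * c_out)  # (64*4) * (64*8)
```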

codecov[bot] commented 1 year ago

Codecov Report

Patch coverage: 95.83% and project coverage change: +0.13 :tada:

Comparison is base (df92d43) 88.54% compared to head (ae9513b) 88.68%.

Additional details and impacted files

```diff
@@            Coverage Diff             @@
##           master      #77      +/-   ##
==========================================
+ Coverage   88.54%   88.68%   +0.13%
==========================================
  Files          33       33
  Lines        2550     2571      +21
==========================================
+ Hits         2258     2280      +22
+ Misses        292      291       -1
```

| [Impacted Files](https://app.codecov.io/gh/slimgroup/InvertibleNetworks.jl/pull/77?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=slimgroup) | Coverage Δ | |
|---|---|---|
| [src/InvertibleNetworks.jl](https://app.codecov.io/gh/slimgroup/InvertibleNetworks.jl/pull/77?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=slimgroup#diff-c3JjL0ludmVydGlibGVOZXR3b3Jrcy5qbA==) | `60.00% <ø> (ø)` | |
| [src/layers/invertible\_layer\_actnorm.jl](https://app.codecov.io/gh/slimgroup/InvertibleNetworks.jl/pull/77?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=slimgroup#diff-c3JjL2xheWVycy9pbnZlcnRpYmxlX2xheWVyX2FjdG5vcm0uamw=) | `96.59% <66.66%> (+0.12%)` | :arrow_up: |
| [src/utils/dimensionality\_operations.jl](https://app.codecov.io/gh/slimgroup/InvertibleNetworks.jl/pull/77?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=slimgroup#diff-c3JjL3V0aWxzL2RpbWVuc2lvbmFsaXR5X29wZXJhdGlvbnMuamw=) | `95.78% <85.71%> (+0.09%)` | :arrow_up: |
| [src/layers/invertible\_layer\_glow.jl](https://app.codecov.io/gh/slimgroup/InvertibleNetworks.jl/pull/77?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=slimgroup#diff-c3JjL2xheWVycy9pbnZlcnRpYmxlX2xheWVyX2dsb3cuamw=) | `98.78% <100.00%> (+1.48%)` | :arrow_up: |
| [src/networks/invertible\_network\_glow.jl](https://app.codecov.io/gh/slimgroup/InvertibleNetworks.jl/pull/77?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=slimgroup#diff-c3JjL25ldHdvcmtzL2ludmVydGlibGVfbmV0d29ya19nbG93Lmps) | `90.51% <100.00%> (+0.51%)` | :arrow_up: |


rafaelorozco commented 1 year ago

@mloubout what do you think about merging this? I am mostly interested in the (nx, nc, nbatch) inputs, because the (1, 1, nx, nbatch) tensors are not natural for a lot of people. The dense nets could use some work (I want to give users easier options for swapping them out), but that could be a future PR.

mloubout commented 1 year ago

That's fine, just rebase it.