slimgroup / InvertibleNetworks.jl

A Julia framework for invertible neural networks
MIT License

Add scaled and shifted sigmoid #38

Closed rafaelorozco closed 3 years ago

rafaelorozco commented 3 years ago

Following the theory in https://arxiv.org/pdf/2006.09347.pdf and the empirical success of https://arxiv.org/pdf/2007.02462.pdf, we add a tunable parameter to the sigmoid activation layer that shifts and scales its output into the interval [a, b], where b is fixed to 1 and a must be greater than 0. A good choice appears to be a = 0.5.
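A minimal sketch of what such an activation could look like (the function names, signatures, and defaults below are assumptions for illustration, not the exact API merged in this PR):

```julia
# Illustrative sketch only: names and signatures are hypothetical,
# not the exact code added in this PR.
sigmoid(x) = 1 / (1 + exp(-x))

# Scaled and shifted sigmoid mapping x into [a, b];
# the PR fixes b = 1 and requires a > 0, with a = 0.5 as a good default.
scaled_sigmoid(x; a=0.5f0, b=1.0f0) = a + (b - a) * sigmoid(x)

# Analytic inverse, so the activation remains usable in an invertible network:
# y = a + (b - a) * sigmoid(x)  =>  x = log((y - a) / (b - y))
inv_scaled_sigmoid(y; a=0.5f0, b=1.0f0) = log((y - a) / (b - y))
```

Keeping a > 0 bounds the output away from zero, which is the property the linked papers rely on.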

codecov[bot] commented 3 years ago

Codecov Report

Merging #38 (2bd0114) into master (11e8e13) will increase coverage by 0.00%. The diff coverage is 100.00%.


@@           Coverage Diff           @@
##           master      #38   +/-   ##
=======================================
  Coverage   84.10%   84.11%           
=======================================
  Files          31       31           
  Lines        2479     2480    +1     
=======================================
+ Hits         2085     2086    +1     
  Misses        394      394           
Impacted Files                      | Coverage Δ
src/utils/activation_functions.jl  | 85.50% <100.00%> (+0.21%) ↑
