Closed: mloubout closed this pull request 2 years ago.
Merging #53 (625f09a) into master (80053c1) will increase coverage by 2.16%. The diff coverage is 90.55%.

:exclamation: Current head 625f09a differs from pull request most recent head 4d02470. Consider uploading reports for the commit 4d02470 to get more accurate results.
```diff
@@            Coverage Diff             @@
##           master      #53      +/-   ##
==========================================
+ Coverage   85.94%   88.11%   +2.16%
==========================================
  Files          31       31
  Lines        2511     2330     -181
==========================================
- Hits         2158     2053     -105
+ Misses        353      277      -76
```
| Impacted Files | Coverage Δ | |
|---|---|---|
| src/InvertibleNetworks.jl | 60.00% <0.00%> (ø) | |
| src/conditional_layers/conditional_layer_hint.jl | 99.18% <ø> (+4.33%) | :arrow_up: |
| ...itional_layers/conditional_layer_residual_block.jl | 100.00% <ø> (+9.19%) | :arrow_up: |
| src/layers/invertible_layer_basic.jl | 93.33% <ø> (+1.12%) | :arrow_up: |
| src/layers/invertible_layer_glow.jl | 97.40% <ø> (-0.22%) | :arrow_down: |
| src/layers/invertible_layer_hint.jl | 94.31% <ø> (+1.48%) | :arrow_up: |
| src/layers/invertible_layer_hyperbolic.jl | 86.20% <ø> (+2.04%) | :arrow_up: |
| src/layers/invertible_layer_irim.jl | 98.14% <ø> (+4.70%) | :arrow_up: |
| src/layers/invertible_layer_template.jl | 0.00% <ø> (ø) | |
| src/layers/layer_affine.jl | 97.22% <ø> (+7.22%) | :arrow_up: |
| ... and 19 more | | |
Continue to review the full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data

Powered by Codecov. Last update 80053c1...4d02470. Read the comment docs.
@rafaelorozco from the test log (bottom of it) it seems slightly better for memory now. Looking into the IRIM part and the cat/...
@rafaelorozco

- Update compat and dependency versions
- Clean up activations (better performance too)
- Refactor base abstract and concrete types for a slightly easier codebase (sketch below)
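For that last point, a minimal sketch of the kind of abstract/concrete type split being described; all names below are invented for illustration and are not the package's actual types:

```julia
# Illustrative only: a shared abstract supertype lets generic methods be
# written once instead of being duplicated on every concrete layer.
abstract type SketchInvertibleLayer end

struct SketchAffineLayer <: SketchInvertibleLayer
    scale::Vector{Float64}
    shift::Vector{Float64}
end

# One generic fallback covering every subtype:
nparams(l::SketchInvertibleLayer) = sum(length(getfield(l, f)) for f in fieldnames(typeof(l)))
```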
@grizzuti the "generalization" changed the ordering of parameter retrieval. I don't think it's an issue, but it may be a problem when loading parameters saved with older versions.
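To make the compatibility concern concrete, a minimal sketch; the `NetworkGlow` constructor arguments are an assumption for illustration, and `get_params`/`set_params!` are the calls discussed later in this thread:

```julia
using InvertibleNetworks

# Build a small network; argument values are illustrative.
G = NetworkGlow(2, 32, 2, 4)

# get_params now enumerates parameters in the new order, so a parameter
# vector saved by an older release is a permutation of this one:
theta = get_params(G)

# Reloading an old checkpoint without reordering would assign weights to
# the wrong layers (or error on a shape mismatch):
# set_params!(G, old_theta)   # old_theta: parameters saved pre-refactor
```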
Would it be possible not to change the order of parameter retrieval? As you said, it may cause issues with older pretrained models.
> Would it be possible not to change the order of parameter retrieval? As you said, it may cause issues with older pretrained models.

It is kind of a pain: right now it is done by hand for each network, and there is no easy "generalized" way to reproduce the hand-written order, because each network makes an implicit assumption about the order in which the layers (the struct's fields) are used.
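A self-contained toy example of that implicit assumption (types invented for illustration): a generic walk over struct fields follows declaration order, which need not match the order in which the layers are applied.

```julia
struct ToyLayer
    W::Matrix{Float64}
end

struct ToyNet
    post::ToyLayer   # declared first in the struct...
    pre::ToyLayer    # ...but applied first in the forward pass
end

# Generic retrieval follows field declaration order: [post.W, pre.W]
generic_params(net) = [getfield(net, f).W for f in fieldnames(typeof(net))]

# Hand-written retrieval follows application order: [pre.W, post.W]
handwritten_params(net::ToyNet) = [net.pre.W, net.post.W]
```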
I agree that it looks much better now. Alternatively, would it be possible to create a similar function that does the operation in reverse? E.g., put_params?
There is a set_params!(network, theta) already, yes.
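So a round trip with the existing API would look like this (the network construction is an assumption for illustration):

```julia
using InvertibleNetworks

G = NetworkGlow(2, 32, 2, 4)     # assumed example architecture
theta = deepcopy(get_params(G))  # extract a detached copy of the parameters
set_params!(G, theta)            # write them back in the same order
```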