Closed lorenzoh closed 2 years ago
Apparently this fails on 0.12 due to changes necessary to `Embedding`s for 0.13. @darsnack do you know what could have caused this and if this means that the 0.13 changes to `Embedding` are not forward compatible?
This isn't because of `Embedding`. `Parallel(connection, layer)` (i.e. a single branch) doesn't wrap the single layer in a tuple, so instead of iterating the tuple as expected, we are iterating the branch itself. On v0.13 this should not be the case.
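To make the difference concrete, here is a minimal sketch (the layer sizes are arbitrary, and the v0.12 behaviour is described in comments based on the explanation above, not re-verified):

```julia
using Flux

# A single-branch Parallel:
m = Parallel(vcat, Dense(2, 3))

# On Flux v0.13, `m.layers` should always be a Tuple, so iterating it
# yields the branches:
m.layers isa Tuple  # expected to be true on v0.13

# On v0.12 the single layer was stored unwrapped, so code iterating
# `m.layers` walked the branch itself rather than a one-element
# tuple of branches.
```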
Okay, this is because https://github.com/FluxML/Flux.jl/pull/1862 added a type restriction to `Parallel` that probably should have always been there.
You could either remove 0.12 from the compat, special-case 0.12 in the source, or, on L50 of models.jl, do `Tuple(catbackbone[2].layers)` in the `mapreduce`.
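A hedged sketch of that third option (only `Tuple(catbackbone[2].layers)` comes from the suggestion above; the per-branch function `f` is a placeholder for whatever models.jl actually computes):

```julia
# `catbackbone[2]` is assumed to be the Parallel layer in question.
# Wrapping `.layers` in `Tuple` gives the same iteration behaviour on
# v0.12 (where a single-branch Parallel stores the bare layer) and
# v0.13 (where `.layers` is already a Tuple):
branches = Tuple(catbackbone[2].layers)

# `f` stands in for the per-branch computation in the mapreduce:
mapreduce(f, +, branches)
```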
Thanks! I'll try the forward-compatible fix so that FastAI.jl can be used with both 0.12 and 0.13.
Closes #201.
Depends on Flux 0.13 release and https://github.com/FluxML/FluxTraining.jl/pull/103