FluxML / FastAI.jl

Repository of best practices for deep learning in Julia, inspired by fastai
https://fluxml.ai/FastAI.jl
MIT License

Add Flux 0.13 compatibility #202

Closed · lorenzoh closed this 2 years ago

lorenzoh commented 2 years ago

Closes #201.

Depends on the Flux 0.13 release and https://github.com/FluxML/FluxTraining.jl/pull/103.

lorenzoh commented 2 years ago

Apparently this fails on Flux 0.12 due to the changes to Embedding that were necessary for 0.13. @darsnack, do you know what could have caused this, and whether it means the 0.13 changes to Embedding are not forward compatible?

darsnack commented 2 years ago

This isn't because of Embedding. On v0.12, Parallel(connection, layer) (i.e. a single branch) doesn't wrap the single layer in a tuple, so instead of iterating a tuple of branches as expected, we end up iterating the branch itself. On v0.13 this is no longer the case.
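
A minimal sketch of the behavior described above (not code from the PR itself), showing how the single-branch case differs between the two Flux versions:

```julia
using Flux

# A Parallel with a single branch, as in the failing case.
p = Parallel(+, Dense(2, 3))

# On Flux 0.12 the lone branch is stored as-is, so `p.layers` is the Dense
# layer itself and code that iterates `p.layers` walks the layer, not a
# tuple of branches. On Flux 0.13, `p.layers` is a Tuple holding the Dense.
@show typeof(p.layers)
```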

darsnack commented 2 years ago

Okay, this is because https://github.com/FluxML/Flux.jl/pull/1862 added a type restriction to Parallel that probably should have always been there.
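
Paraphrasing the effect of that restriction (a sketch with stand-in types, not the actual Flux source): while the layers field is unconstrained, the plain two-argument constructor matches a single layer and stores it unwrapped; once the field is restricted to a Tuple or NamedTuple, only the splatting constructor applies, so the layer gets wrapped.

```julia
# Flux 0.12 style: `layers` is unconstrained, so the default two-argument
# constructor wins and a lone layer is stored without wrapping.
struct LooseParallel{F, T}
    connection::F
    layers::T
end
LooseParallel(connection, layers...) = LooseParallel(connection, layers)

# Flux 0.13 style (after FluxML/Flux.jl#1862): `layers` must be a Tuple or
# NamedTuple, so a lone layer falls through to the splatting constructor
# and is stored as `(layer,)`.
struct StrictParallel{F, T<:Union{Tuple, NamedTuple}}
    connection::F
    layers::T
end
StrictParallel(connection, layers...) = StrictParallel(connection, layers)

LooseParallel(+, identity).layers   # identity (unwrapped)
StrictParallel(+, identity).layers  # (identity,)
```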

You could either remove 0.12 from the compat bounds, special-case 0.12 in the source, or, on L50 of models.jl, use Tuple(catbackbone[2].layers) in the mapreduce.
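
A hypothetical sketch of that third option (the actual code around L50 of models.jl is not shown in this thread; `embedwidth` and the toy `catbackbone` below are placeholders for illustration):

```julia
using Flux

# Toy stand-in for the categorical backbone; the real one uses Embedding
# layers for the categorical columns.
catbackbone = Chain(identity, Parallel(vcat, Dense(10, 4), Dense(5, 4)))

# Placeholder per-branch measurement standing in for whatever the real
# mapreduce on L50 computes.
embedwidth(layer) = size(layer.weight, 1)

# Wrapping `.layers` in `Tuple(...)` makes the reduction iterate a tuple of
# branches on both Flux 0.12 and 0.13, per the suggestion above.
totalsize = mapreduce(embedwidth, +, Tuple(catbackbone[2].layers))
```

On 0.13, `.layers` is already a Tuple, so the conversion is effectively a no-op.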

lorenzoh commented 2 years ago

Thanks! I'll try the forward-compatible fix so that FastAI.jl can be used with both Flux 0.12 and 0.13.