The NN aerosol activation emulator training was broken in the longruns because the API of the `MLJFlux` package has changed:

`ArgumentError: Flux.jl optimiser detected. Only optimisers from Optimisers.jl are supported.`
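For context, a minimal sketch of the kind of model construction that now triggers this error (the builder and hyperparameters are placeholders for illustration, not the actual emulator code):

```julia
using MLJFlux, Flux

# Placeholder builder; the real emulator uses its own architecture.
builder = MLJFlux.Short(n_hidden = 32, σ = Flux.relu)

# Passing an optimiser from Flux.jl's own (implicit-style) Flux.Optimise
# module is what newer MLJFlux versions reject with the ArgumentError above.
model = MLJFlux.NeuralNetworkRegressor(
    builder = builder,
    optimiser = Flux.Optimise.Adam(0.001),
    epochs = 100,
)
```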
I tried updating to the `Adam` optimiser from the `Optimisers` package, as the error message suggests. However, I then get a different error:

`ERROR: Optimisers.jl cannot be used with Zygote.jl's implicit gradients, Params & Grads`
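The attempted switch looks roughly like this (again an illustrative sketch; only the optimiser line changes from the version above):

```julia
using MLJFlux, Flux, Optimisers

builder = MLJFlux.Short(n_hidden = 32, σ = Flux.relu)  # placeholder builder

# Explicit-style optimiser from Optimisers.jl, as newer MLJFlux requires.
model = MLJFlux.NeuralNetworkRegressor(
    builder = builder,
    optimiser = Optimisers.Adam(0.001),
    epochs = 100,
)

# With the package versions in this environment, fitting then fails inside
# the training loop with the Zygote error above, because part of the stack
# still computes gradients implicitly via Zygote's Params.
```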
At least for now, I'm just pinning the `MLJFlux` package at the older version.
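In Julia terms, the pin amounts to something like this (the version bound shown is an assumption standing in for whatever pre-change release the manifest actually resolves to):

```julia
using Pkg

# Pin MLJFlux to the last release that still accepts Flux.jl optimisers.
# v"0.4.0" is illustrative; the real pin lives in the project environment.
Pkg.pin(PackageSpec(name = "MLJFlux", version = v"0.4.0"))
```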
All modified and coverable lines are covered by tests :white_check_mark:
Project coverage is 97.01%. Comparing base (f73555a) to head (1973e64). Report is 1 commit behind head on main.