FluxML / FluxTraining.jl

A flexible neural net training library inspired by fast.ai
https://fluxml.ai/FluxTraining.jl
MIT License

Add support for Optimisers.jl #114

Closed: lorenzoh closed this 2 years ago

lorenzoh commented 2 years ago

Closes #112 (once done). @ToucheSir @darsnack

So this is a first draft for adding Optimisers.jl support (new optims) while keeping compatibility with optimisers in Flux.Optimise (old optims).

Passing in new optims already works, but I've broken support for old optims. Previously, FluxTraining.jl used implicit parameters via `Params` and `Grads` objects; I'm not sure how to make the old optims work with explicit parameters passed to `gradient`.
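For context, the two styles differ roughly as follows. This is a hedged sketch, not code from this PR; `model`, `loss`, `x`, and `y` are placeholders, and it assumes a Flux version where both `Flux.Optimise` and Optimisers.jl are available:

```julia
using Flux, Optimisers

model = Dense(2 => 1)
x, y  = rand(Float32, 2, 8), rand(Float32, 1, 8)
loss(m, x, y) = Flux.mse(m(x), y)

# Old style (Flux.Optimise): implicit parameters via Params/Grads
opt = Flux.Optimise.Descent(0.1)
ps  = Flux.params(model)                          # Zygote.Params
gs  = Flux.gradient(() -> loss(model, x, y), ps)  # Zygote.Grads
Flux.Optimise.update!(opt, ps, gs)                # mutates parameters in place

# New style (Optimisers.jl): the model itself is passed to `gradient`
rule  = Optimisers.Descent(0.1)
state = Optimisers.setup(rule, model)             # per-leaf optimiser state
grad, = Flux.gradient(m -> loss(m, x, y), model)  # structural gradient
state, model = Optimisers.update!(state, model, grad)
```

The old path mutates parameters through a `Params` collection, while the new path threads explicit optimiser state and returns an updated model, which is why the two cannot share a single `gradient` call.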

I'll leave some more questions next to the code changes, some feedback from you two would be much appreciated!

lorenzoh commented 2 years ago

@darsnack I added the dispatch for `gradient`. Can you take a quick look and check that it's okay before I merge?
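One way such a dispatch could look, selecting the gradient style from the optimiser's type, is sketched below. These helper names are hypothetical, not the actual PR code, and it assumes the `Flux.Optimise.AbstractOptimiser` and `Optimisers.AbstractRule` supertypes cover the old and new optims respectively:

```julia
using Flux, Optimisers

# Hypothetical helper: old optims take the implicit Params/Grads path.
function _step!(lossfn, model, opt::Flux.Optimise.AbstractOptimiser, state)
    ps = Flux.params(model)
    gs = Flux.gradient(() -> lossfn(model), ps)
    Flux.Optimise.update!(opt, ps, gs)   # updates parameters in place
    return model, state
end

# New optims take the explicit path and thread optimiser state through.
function _step!(lossfn, model, opt::Optimisers.AbstractRule, state)
    grad, = Flux.gradient(lossfn, model)
    state, model = Optimisers.update!(state, model, grad)
    return model, state
end
```

Dispatching on the optimiser type keeps both code paths behind one call site, so the training loop itself does not need to know which kind of optim it was given.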