FluxML / MLJFlux.jl

Wrapping deep learning models from the package Flux.jl for use in the MLJ.jl toolbox
http://fluxml.ai/MLJFlux.jl/
MIT License

Tidy up #192

Closed · ablaom closed this 2 years ago

ablaom commented 2 years ago

This PR tidies up some implementation details and renames a few files for clarity.

Previously, when implementing regularisation penalties, we combined the bare loss and the penalty into a single function, `penalized_loss(x, y)`, but a more flexible arrangement is to keep the two separate for as long as possible. This is the main substantive change in the PR.
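To illustrate the idea, here is a rough sketch in plain Flux (not the actual MLJFlux internals; the model, data, and regularisation strength `λ` are made up for the example). The old arrangement fuses the penalty into the objective up front, while the more flexible arrangement keeps a bare loss and a penalty as separate callables and only sums them at the point where a single objective is actually needed:

```julia
using Flux

# Toy model and data, purely for illustration.
model = Chain(Dense(4 => 8, relu), Dense(8 => 1))
x, y = rand(Float32, 4, 32), rand(Float32, 1, 32)

λ = 0.01f0  # hypothetical regularisation strength

# L2 penalty on all trainable parameters (implicit-params style API).
penalty(m) = λ * sum(p -> sum(abs2, p), Flux.params(m))

# Old arrangement: bare loss and penalty fused into one function,
# so downstream code can no longer see either piece on its own.
penalized_loss(m, x, y) = Flux.mse(m(x), y) + penalty(m)

# New arrangement: keep the two separate as long as possible and
# combine them only where a single objective is actually required.
bare_loss(m, x, y) = Flux.mse(m(x), y)
objective(m, x, y) = bare_loss(m, x, y) + penalty(m)

# Keeping them separate lets callers report the bare loss and the
# penalty individually, for example:
@info "training" loss = bare_loss(model, x, y) penalty = penalty(model)
```

Separating the two also means the penalty can be swapped or rescaled without touching the loss, and the bare loss can be reported without the penalty folded in.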

A use case in my research was the impetus for making this change.

ablaom commented 2 years ago

The Buildkite failures are only code-coverage upload failures.