DiffEq(For)Lux.jl (aka DiffEqFlux.jl) fuses the world of differential equations with machine learning by helping users put diffeq solvers into neural networks. This package utilizes DifferentialEquations.jl and Lux.jl as its building blocks to support research in Scientific Machine Learning, specifically neural differential equations, which add physical information to traditional machine learning.
> [!NOTE]
> We maintain backwards compatibility with Flux.jl via `FromFluxAdaptor()`.
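For example, an existing Flux chain can be converted into a Lux layer before being used with the layer functions below. A minimal sketch, where the model itself is just a placeholder:

```julia
using Lux, Flux  # Flux is only needed for the legacy model being converted

flux_model = Flux.Chain(Flux.Dense(2 => 16, tanh), Flux.Dense(16 => 2))
lux_model = FromFluxAdaptor()(flux_model)  # now usable anywhere a Lux layer is expected
```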
For information on using the package, see the stable documentation. Use the in-development documentation for the version that includes unreleased features.
DiffEqFlux.jl is for implicit layer machine learning. It provides architectures that match the interfaces of machine learning libraries such as Flux.jl and Lux.jl, making it easy to build continuous-time machine learning layers into larger machine learning applications.
The following layer functions exist:

- Neural Ordinary Differential Equations (Neural ODEs)
- Collocation-Based Neural ODEs (Neural ODEs without a solver, by far the fastest way!)
- Multiple Shooting Neural Ordinary Differential Equations
- Neural Stochastic Differential Equations (Neural SDEs)
- Neural Differential-Algebraic Equations (Neural DAEs)
- Neural Delay Differential Equations (Neural DDEs)
- Augmented Neural ODEs
- Hamiltonian Neural Networks (with specialized second order and symplectic integrators)
- Continuous Normalizing Flows (CNF) and FFJORD

with high order, adaptive, implicit, GPU-accelerated, Newton-Krylov, etc. methods. For examples, please refer to the release blog post. Additional demonstrations, like neural PDEs and neural jump SDEs, can be found in this blog post (among many others!).
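For instance, a neural ODE wraps a model as the right-hand side of an ODE and solves it with a DifferentialEquations.jl solver. A minimal sketch, assuming layer sizes, time span, and `saveat` chosen purely for illustration:

```julia
using DiffEqFlux, Lux, OrdinaryDiffEq, Random

rng = Random.default_rng()
dudt = Chain(Dense(2 => 16, tanh), Dense(16 => 2))        # model for du/dt
node = NeuralODE(dudt, (0.0f0, 1.0f0), Tsit5(); saveat = 0.1f0)

ps, st = Lux.setup(rng, node)                             # parameters and states
sol, st = node(Float32[2.0, 0.0], ps, st)                 # solve from the initial condition
```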
Do not limit yourself to the current neuralization. With this package, you can explore various ways to integrate the two methodologies:

- Neural networks can be defined where the "activations" are nonlinear functions described by differential equations.
- Neural networks can be defined where some layers are ODE solves.
- ODEs can be defined where some terms are neural networks (see the sketch after this list).
- Cost functions on ODEs can define neural networks.
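As a rough illustration of the third pattern, here is a minimal sketch of an ODE whose right-hand side mixes a known term with a neural network; the dynamics, layer sizes, and the `ude!` helper are made up for illustration:

```julia
using DiffEqFlux, Lux, OrdinaryDiffEq, Random

# A small Lux network used as a learned correction term inside the ODE.
rng = Random.default_rng()
nn = Chain(Dense(2 => 8, tanh), Dense(8 => 2))
ps, st = Lux.setup(rng, nn)

# Right-hand side: known linear decay plus the neural-network term.
function ude!(du, u, p, t)
    correction, _ = nn(u, p, st)
    du .= -0.1f0 .* u .+ correction
end

prob = ODEProblem(ude!, Float32[1.0, 0.0], (0.0f0, 1.0f0), ps)
sol = solve(prob, Tsit5())
```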
Breaking changes in v4:

- `TensorLayer` has been removed; use `Boltz.Layers.TensorProductLayer` instead.
- Basis functions in DiffEqFlux have been removed in favor of the `Boltz.Basis` module.
- `SplineLayer` has been removed; use `Boltz.Layers.SplineLayer` instead.
- `NeuralHamiltonianDE` has been removed; use `NeuralODE` with `Layers.HamiltonianNN` instead.
- `HamiltonianNN` has been removed in favor of `Layers.HamiltonianNN`.
- `Lux` and `Boltz` are updated to v1.

Breaking changes in v3:

- The Flux dependency has been dropped. If a model that is not a Lux `AbstractLuxLayer` is passed, we try to automatically convert it to a Lux model with `FromFluxAdaptor()(model)`.
- `Flux` is no longer re-exported from DiffEqFlux. Instead we reexport `Lux`.
- `NeuralDAE` now allows an optional `du0` as input.
- `TensorLayer` is now a Lux neural network.
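As an illustration of the automatic conversion mentioned above, passing a Flux model where a Lux layer is expected should trigger `FromFluxAdaptor()(model)` internally. A minimal sketch with illustrative layer sizes and solver options:

```julia
using DiffEqFlux, OrdinaryDiffEq
import Flux

flux_rhs = Flux.Chain(Flux.Dense(2 => 16, tanh), Flux.Dense(16 => 2))
# DiffEqFlux should convert flux_rhs to a Lux model under the hood.
node = NeuralODE(flux_rhs, (0.0f0, 1.0f0), Tsit5(); saveat = 0.1f0)
```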