SciML / ADTypes.jl

Repository for automatic differentiation backend types
https://sciml.github.io/ADTypes.jl/
MIT License

Support FastDifferentiation.jl #34

Closed gdalle closed 6 months ago

gdalle commented 6 months ago

Additional context

FastDifferentiation.jl is an efficient symbolic differentiation backend that DifferentiationInterface.jl can use.

cc @brianguenter: do you think there should be other parameters here to specify how the backend is configured? Something similar to, e.g., the chunk size in ForwardDiff or the compiled-mode toggle in ReverseDiff?
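For reference, existing ADTypes backends expose such configuration as struct fields. A minimal self-contained sketch of that convention (these are local stand-in definitions for illustration, not the actual ADTypes structs):

```julia
# Local stand-in for ADTypes.AbstractADType, so this sketch runs on its own.
abstract type AbstractADType end

# ForwardDiff-style backend: carries the chunk size as a type parameter.
struct AutoForwardDiff{chunksize} <: AbstractADType end
AutoForwardDiff(; chunksize=nothing) = AutoForwardDiff{chunksize}()

# ReverseDiff-style backend: carries a compiled-mode toggle as a field.
struct AutoReverseDiff <: AbstractADType
    compile::Bool
end
AutoReverseDiff(; compile=false) = AutoReverseDiff(compile)

backend = AutoReverseDiff(compile=true)
```

The question above is whether AutoFastDifferentiation needs analogous knobs, or can stay parameterless.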

gdalle commented 6 months ago

@Vaibhavdixit02 let's maybe wait until @brianguenter weighs in on the content of the new struct before registering v0.2.8?

brianguenter commented 6 months ago

@Vaibhavdixit02 @gdalle I'm happy to take a look. Which struct are you referring to?

brianguenter commented 6 months ago

Currently FastDifferentiation has three distinct phases that are visible to the end user: derivative graph analysis (to the user this looks like symbolic derivative calculation), code generation, and code execution.

I plan to make a new version that will collapse these into one. The system will do something similar to trace compilation.

In this system there might be parameters associated with the number of conditionals the compiler will trace through, or the number of traces that will be cached. How would that fit into your framework? Would those parameters go in this struct?
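If such trace-compilation parameters did materialize, they could follow the same field-based convention as other configurable backends. A hypothetical sketch, with field names and defaults invented purely for illustration:

```julia
# Local stand-in for ADTypes.AbstractADType.
abstract type AbstractADType end

# Hypothetical layout: trace-compilation limits carried by the backend struct.
# Field names and defaults are invented for illustration only.
struct AutoFastDifferentiation <: AbstractADType
    max_traced_conditionals::Int  # how many conditionals the compiler traces through
    trace_cache_size::Int         # how many compiled traces are cached
end

AutoFastDifferentiation(; max_traced_conditionals=8, trace_cache_size=16) =
    AutoFastDifferentiation(max_traced_conditionals, trace_cache_size)
```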

gdalle commented 6 months ago

Yes, but since those parameters don't exist yet, we can leave the struct empty for now.

brianguenter commented 6 months ago

Should code generation parameters go in this struct? These are the code generation parameters currently available:

- in-place vs. out-of-place function generation
- incrementing into vs. zeroing the result array

How about sparsity? Should that be specified here as well?
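One option would be to carry code generation choices, like the in-place and zeroing behaviors discussed in this thread, as backend fields. A hypothetical sketch of that layout (local stand-in types; the actual struct contents were still under discussion here):

```julia
# Local stand-in for ADTypes.AbstractADType.
abstract type AbstractADType end

# Hypothetical layout: code generation options as backend fields.
struct AutoFastDifferentiation <: AbstractADType
    in_place::Bool         # generate f!(result, x) instead of result = f(x)
    init_with_zeros::Bool  # zero the result array instead of incrementing into it
end

AutoFastDifferentiation(; in_place=false, init_with_zeros=true) =
    AutoFastDifferentiation(in_place, init_with_zeros)
```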

Vaibhavdixit02 commented 6 months ago

Do you mean a boolean flag, or are there choices for how the sparsity detection is done?

Also note that AutoSparseFastDifferentiation already exists.

gdalle commented 6 months ago

I think for sparsity the easiest approach is to split the backend in two, sparse and dense, as was done for other backends.

For in-place vs. out-of-place: in the DifferentiationInterface framework, the same backend object can be used both ways, so I would leave that out.

For incrementing vs. zeroing the result: I think the default will be zeroing, so again I would leave it out.
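The dense/sparse split described above can be sketched as two parameterless structs, using the names already mentioned in the thread (local stand-in definitions, not the registered ones; the is_sparse helper is invented for illustration):

```julia
# Local stand-in for ADTypes.AbstractADType.
abstract type AbstractADType end

# Dense backend: left empty for now, per the discussion above.
struct AutoFastDifferentiation <: AbstractADType end

# Sparse variant as a separate type, mirroring other sparse backends.
struct AutoSparseFastDifferentiation <: AbstractADType end

# Hypothetical helper: downstream code can select sparse-aware paths
# by dispatching on the backend type alone.
is_sparse(::AbstractADType) = false
is_sparse(::AutoSparseFastDifferentiation) = true
```

Keeping the two as distinct types means no runtime flag is needed; dispatch does the work.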