JamesYang007 / FastAD

FastAD is a C++ implementation of automatic differentiation in both forward and reverse mode.

Some general questions about FastAD (new to C++) #104

Open CerulloE1996 opened 1 year ago

CerulloE1996 commented 1 year ago

I sent these questions in an email to Dr. Yang, but I thought I would post them here too.

I have read the paper on FastAD and I am very interested in using FastAD for my algorithm.

I was wondering, does FastAD work with Rcpp? If so, how can I install it? I think it should be possible, but I just wanted to check (I'm new to C++).

I have used the "autodiff" library (https://autodiff.github.io/), but for my application I found it to be not much faster than numerical differentiation. Have you used it before? I noticed that the paper doesn't benchmark against it.

Also, I was wondering: is it possible to compute a gradient w.r.t. a std::vector filled with Eigen matrices (or any other 3D or higher-dimensional structure)? Or do all the parameters need to be packed into a single vector or matrix, and then reformatted back into the container needed for the rest of the model afterwards?
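For concreteness, here is the kind of thing I mean, sketched against what I understand of FastAD's API from its README (ad::Var with scl/vec/mat shapes, ad::bind, ad::autodiff). I am assuming that * is elementwise for equal shapes and that get() exposes the underlying Eigen object, so the exact calls may be off:

```cpp
#include <fastad>
#include <iostream>

int main() {
    using namespace ad;

    // One matrix-shaped Var per element of the std::vector<Eigen::MatrixXd>;
    // FastAD's shapes are scl/vec/mat, so nesting would be expressed this way.
    Var<double, mat> A(2, 2), B(2, 2);
    A.get() << 1., 2., 3., 4.;   // assumes get() returns the underlying Eigen object
    B.get() << 5., 6., 7., 8.;

    Var<double, scl> y;
    auto expr = bind(y = sum(A * B));  // f = sum of elementwise products (assumption)
    autodiff(expr);                    // one forward + reverse sweep

    std::cout << y.get() << "\n";          // f(A, B)
    std::cout << A.get_adj(0, 0) << "\n";  // df/dA(0,0), should equal B(0,0)
    return 0;
}
```

If something like this works, the reshaping back into the model's containers would only touch values, not the differentiated expression.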

Is it possible to use FastAD just within a standard function (rather than in "int main()", etc.)? I'm new to C++ and have just been using functions for everything (I also use it through R via Rcpp).
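For example, a wrapper along these lines is what I have in mind (the function itself is hypothetical; the FastAD calls follow the README's scalar example):

```cpp
#include <fastad>
#include <utility>

// Hypothetical helper: returns {f(x0, y0), df/dx at (x0, y0)}
// for f(x, y) = x * sin(y) + x * y. Nothing here requires main().
std::pair<double, double> value_and_dx(double x0, double y0) {
    ad::Var<double, ad::scl> x(x0), y(y0), f;
    auto expr = ad::bind(f = x * ad::sin(y) + x * y);
    ad::autodiff(expr);                 // evaluate and accumulate adjoints
    return {f.get(), x.get_adj(0, 0)};  // value and adjoint (gradient entry) of x
}
```

A function like this could then be exported to R through Rcpp like any other C++ function.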

eddelbuettel commented 1 year ago

Please see https://github.com/eddelbuettel/rcppfastad -- in response to your StackOverflow question. It is a (truly minimal) package: we can do this because FastAD is nicely self-contained. With Eigen provided via RcppEigen, we just add the FastAD headers. The package ships one simple example, Black-Scholes; as a proof of concept, I have extended it to also compute an additional derivative, 'vega'.
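The shape of the integration is essentially this (a condensed, hypothetical sketch, not the package's actual Black-Scholes code; it assumes the FastAD and Eigen headers are on the include path, which the package arranges):

```cpp
// [[Rcpp::depends(RcppEigen)]]
#include <RcppEigen.h>
#include <fastad>

// Toy example: f(x) = x^2 * sin(x); returns the value and df/dx.
// [[Rcpp::export]]
Rcpp::List value_and_grad(double x0) {
    ad::Var<double, ad::scl> x(x0), f;
    auto expr = ad::bind(f = x * x * ad::sin(x));
    ad::autodiff(expr);  // forward value + reverse-mode gradient
    return Rcpp::List::create(Rcpp::Named("value") = f.get(),
                              Rcpp::Named("grad")  = x.get_adj(0, 0));
}
```

From R this can be compiled with Rcpp::sourceCpp() and called like any other R function, e.g. value_and_grad(1.0).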