yuanqing-wang opened 2 years ago
IIRC, second derivatives are currently not supported in autodiff, but I'm not very familiar with the autodiff system either.
cc: @erizmr
Hi @yuanqing-wang, the current autodiff system does not support second-order derivatives yet, but it is on our roadmap.
You guys probably know this, but higher-order AD is not that difficult: just make AD return source code instead of a runtime construct, then run it again. For better efficiency, though, one should combine a single backward AD pass with a single Taylor-mode forward AD pass.
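For the curious, here is a minimal, framework-agnostic sketch of the "run it again" idea, using nested forward-mode dual numbers. This is purely illustrative and not how Taichi's autodiff works; the `Dual` class and `derivative` helper are hypothetical names for this example.

```python
# Minimal forward-mode AD via dual numbers. Nesting Dual inside Dual
# (i.e., applying AD to AD) yields second derivatives.

class Dual:
    def __init__(self, val, dot=0.0):
        self.val = val  # primal value
        self.dot = dot  # tangent (derivative part)

    def _lift(self, other):
        return other if isinstance(other, Dual) else Dual(other)

    def __add__(self, other):
        other = self._lift(other)
        return Dual(self.val + other.val, self.dot + other.dot)

    __radd__ = __add__

    def __mul__(self, other):
        other = self._lift(other)
        # Product rule: (u * v)' = u' * v + u * v'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)

    __rmul__ = __mul__


def derivative(f, x):
    # Seed the tangent with 1 and read off df/dx.
    return f(Dual(x, 1.0)).dot


def f(x):
    return x * x * x  # f(x) = x^3


# First derivative: f'(x) = 3 x^2, so f'(3) = 27.
print(derivative(f, 3.0))
# Second derivative by applying AD to AD: f''(x) = 6 x, so f''(3) = 18.
print(derivative(lambda x: derivative(f, x), 3.0))
```

Nesting works because the inner `derivative` call produces `Dual`-valued arithmetic that the outer pass can differentiate again.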
I implemented higher-order derivatives in one of the deep learning libraries. Taichi's autodiff system might be different from a typical deep learning framework. Still, I hope higher-order derivatives are not so difficult to implement in Taichi, since for most functions the grad of the grad has the same implementation as the given function, just with different inputs.
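One concrete instance of that observation (my own illustrative example, not from the thread): every derivative of sin is again a call to sin, just with a shifted input.

```python
import math

# "Same implementation, different inputs": the n-th derivative of
# sin(x) is sin(x + n * pi/2).
def dsin(x, order=0):
    return math.sin(x + order * math.pi / 2.0)

print(dsin(1.0, 0), math.sin(1.0))    # f(x)  = sin(x)
print(dsin(1.0, 1), math.cos(1.0))    # f'(x) = cos(x)
print(dsin(1.0, 2), -math.sin(1.0))   # f''(x) = -sin(x)
```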
Greetings! I'm new to this community, so my apologies if this question is somewhat naive. But is there a (recommended) way to take derivatives of derivatives in `taichi`? Thanks!
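Not an official answer, but one workaround consistent with the thread: since the autodiff system only provides first-order derivatives, you can write a kernel that computes the first derivative by hand and let autodiff differentiate it once more. A minimal sketch, assuming a recent Taichi version (where the tape lives under `ti.ad.Tape`) and a hypothetical example function f(x) = x^3:

```python
import taichi as ti

ti.init(arch=ti.cpu)

x = ti.field(dtype=ti.f32, shape=(), needs_grad=True)
dfdx = ti.field(dtype=ti.f32, shape=(), needs_grad=True)

@ti.kernel
def first_derivative():
    # Hand-written first derivative of f(x) = x^3, i.e., f'(x) = 3 x^2.
    dfdx[None] = 3.0 * x[None] ** 2

x[None] = 2.0
with ti.ad.Tape(loss=dfdx):
    first_derivative()

# Autodiff of the hand-written f' gives f''(x) = 6 x, so 12.0 at x = 2.
print(x.grad[None])
```

The obvious downside is that you must derive the first derivative yourself, but it only uses first-order autodiff, which the current system supports.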