fredrik-johansson opened 5 years ago
At least for the univariate case, using equidistant nodes on a circle around the point seems to be (near-)optimal. In particular, for N ≥ n+1,
1/z^(n+1) ~ z^(N-n-1)/(z^N - r^N) = sum_w 1/(N * r^n * w^n * (z - r*w))
and thus
f^(n)(x) ~ n! * sum_w 1/(N * r^n * w^n) * f(x + r*w),
where sum_w denotes the sum over the N-th roots of unity w.
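A minimal sketch of this node formula in plain double precision (function name hypothetical; a real implementation would of course use ball arithmetic):

```python
import cmath
from math import factorial

def nth_derivative(f, x, n, r, N):
    """Approximate f^(n)(x) by the sum over the N-th roots of unity:
    f^(n)(x) ~ n! * sum_w f(x + r*w) / (N * r^n * w^n),  with N >= n+1."""
    total = 0.0 + 0.0j
    for k in range(N):
        w = cmath.exp(2j * cmath.pi * k / N)   # w runs over the N-th roots of unity
        total += f(x + r * w) / w**n
    return factorial(n) * total / (N * r**n)

# 6th derivative of exp at 0 (exact value: 1); r = 2 keeps the terms
# of comparable size to the result, so there is little cancellation
approx = nth_derivative(cmath.exp, 0.0, 6, 2.0, 32)
```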
The error (... - f^(n) (x)) is given by the contour integral (over the circle |z| = R, say) of
n! / (2πi) * r^N / (z^(n+1) * (z^N - r^N)) * f(z),
so e.g. if f is bounded on the circle of radius R around x via |f| ≤ M, then
|Error| ≤ n! * M * r^N / (R^n * (R^N - r^N))
This should work around most of the stability problems (the second bullet point in the previous post).
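A quick double-precision check of the error bound (names hypothetical), using f(z) = 1/(1-z), whose pole at z = 1 forces R < 1. Here the true error is n! * r^N / (1 - r^N), since the node sum picks up exactly the Taylor coefficients of index n mod N:

```python
import cmath
from math import factorial

def circle_diff(f, x, n, r, N):
    # f^(n)(x) ~ n! * sum over N-th roots of unity w of f(x + r*w)/(N * r^n * w^n)
    total = sum(f(x + r * cmath.exp(2j * cmath.pi * k / N))
                * cmath.exp(-2j * cmath.pi * k * n / N)
                for k in range(N))
    return factorial(n) * total / (N * r**n)

f = lambda z: 1 / (1 - z)      # analytic for |z| < 1
n, r, N, R = 3, 0.25, 16, 0.9
M = 1 / (1 - R)                # |f| <= M on the circle |z| = R
approx = circle_diff(f, 0.0, n, r, N)
exact = factorial(n)           # f^(3)(0) = 3! since 1/(1-z) = sum z^m
bound = factorial(n) * M * r**N / (R**n * (R**N - r**N))
err = abs(approx - exact)      # true error, should sit below the bound
```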
In theory, the above idea can be used for the multivariate case by a simple product construction, but the number of nodes increases very quickly and it's also suboptimal. This can be understood geometrically by mapping monomials to lattice points, e.g. x^3 y^5 z^2 to (3,5,2). The "exact" monomials from above form a cuboid, but a simplex would likely be better.
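The product construction mentioned above, as a double-precision sketch (names hypothetical): a nested roots-of-unity sum in each variable, costing N1*N2 evaluations for a single mixed partial — the "cuboid" of exact monomials.

```python
import cmath
from math import factorial

def mixed_derivative(f, x, y, a, b, r1, r2, N1, N2):
    """Product construction for d^(a+b) f / (dx^a dy^b) at (x, y):
    a roots-of-unity sum in each variable, N1*N2 function evaluations."""
    total = 0.0 + 0.0j
    for j in range(N1):
        w1 = cmath.exp(2j * cmath.pi * j / N1)
        for k in range(N2):
            w2 = cmath.exp(2j * cmath.pi * k / N2)
            total += f(x + r1 * w1, y + r2 * w2) / (w1**a * w2**b)
    return factorial(a) * factorial(b) * total / (N1 * N2 * r1**a * r2**b)

# d^2/(dx dy) exp(x*y) at (0,0) is 1
val = mixed_derivative(lambda u, v: cmath.exp(u * v),
                       0.0, 0.0, 1, 1, 0.5, 0.5, 8, 8)
```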
There should be some functionality to compute the (multivariate) truncated Taylor series of a given (multivariate) complex analytic function.
Thoughts on this so far:
Need error bounds for finite differences. This should be possible via the multivariate Cauchy integral formula on a polydisc. In most cases, we could just evaluate the integrand once on a box containing the polydisc. In other cases, we might want to compute an adaptive bound over the boundary of the polydisc, to get a tighter bound or to bypass a removable singularity at the differentiation point (trivial example: f(x) = x / x at x = 0).
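For reference, the standard polydisc formula and estimate that would supply these bounds (writing α = (α_1, ..., α_m), α! = α_1! * ... * α_m!):
f^(α)(x) = α! / (2πi)^m * ∮ ... ∮ f(z) / ((z_1 - x_1)^(α_1+1) * ... * (z_m - x_m)^(α_m+1)) dz_1 ... dz_m,
integrated over the torus |z_j - x_j| = r_j, and if |f| ≤ M on that torus, then
|f^(α)(x)| ≤ α! * M / (r_1^(α_1) * ... * r_m^(α_m)).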
Using finite differences, we need (n+1)-fold working precision (somewhat lower with a central difference?) for an nth derivative. In particular, the input variables must be exact (or already accurate to (n+1)-fold precision). Workaround: compute at the midpoints, and also use complex magnitudes to bound the perturbation.
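To illustrate the precision claim, a quick experiment in plain doubles (helper name hypothetical): the n-th forward difference quotient suffers 2^n-scale cancellation, so an 8th derivative is hopeless at 53-bit precision no matter how the step is tuned.

```python
from math import comb, exp

def forward_diff(f, x, n, h):
    """n-th forward difference quotient; in exact arithmetic this tends to
    f^(n)(x) as h -> 0, but in doubles the rounding error grows like
    ~2^n * eps / h^n, so roughly (n+1)-fold precision is needed."""
    s = sum((-1)**(n - k) * comb(n, k) * f(x + k * h) for k in range(n + 1))
    return s / h**n

# 8th derivative of exp at 0 (exact value 1): even the best step size
# in the grid below leaves a large error in double precision
best = min(abs(forward_diff(exp, 0.0, 8, 10.0**(-e)) - 1.0)
           for e in range(1, 9))
```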
It would be easiest to deal with the univariate case first, but the multivariate case is just as important. In fact, a special case is to compute a univariate derivative of a function depending on some parameters, say f(a,b,x). If we want to compute an nth derivative with respect to x and (a, b) are given by inexact balls, we could take the midpoints of (a, b, x) and compute the (0,0,n)-order derivative, adding the perturbation bounds for all variables.
Using complex integration, the existing numerical integration code could be used, but it is probably better in most cases to use the trapezoidal rule. As with finite differences, there's the practical matter of extending this to several variables and evaluating on a big box vs adaptively on an annulus.
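A sketch of the trapezoidal-rule variant (name hypothetical): with equidistant nodes t_k = 2πk/N, the trapezoidal rule applied to the Cauchy integral on a circle reduces to exactly the roots-of-unity sum above, and for analytic f the error decays geometrically in N.

```python
import cmath
from math import factorial, pi

def trapezoid_diff(f, x, n, r, N):
    """Trapezoidal rule for f^(n)(x) = n!/(2*pi*r^n) * integral over
    [0, 2*pi] of f(x + r*e^(i*t)) * e^(-i*n*t) dt.  With equidistant
    nodes this coincides with the N-th-roots-of-unity sum, and the
    error decreases geometrically in N for analytic f."""
    h = 2 * pi / N
    total = sum(f(x + r * cmath.exp(1j * h * k)) * cmath.exp(-1j * n * h * k)
                for k in range(N))
    return factorial(n) * h * total / (2 * pi * r**n)

# 5th derivative of sin at 0 is cos(0) = 1; geometric convergence in N
errs = [abs(trapezoid_diff(cmath.sin, 0.0, 5, 1.0, N) - 1.0)
        for N in (8, 16, 32)]
```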