Closed lbenet closed 5 years ago
I think it would be better to just use IntervalOptimisation.jl, since you don't actually need the derivative, nor all of its roots.
Good point; could you add a new method for that? Have you tested IntervalOptimisation.jl with the (very simple) function f(x) = 1 - x^4 + x^5 over the interval 0 .. 1?
julia> using IntervalArithmetic, IntervalOptimisation
julia> f(x) = 1 - x^4 + x^5
f (generic function with 1 method)
julia> @time minimise(f, 0..1, tol=1e-3)
0.072655 seconds (202.39 k allocations: 10.924 MiB)
([0.916034, 0.918081], Interval{Float64}[[0.802542, 0.803535], [0.801551, 0.802543], [0.798575, 0.799568], [0.797584, 0.798576], [0.800574, 0.801552], [0.795616, 0.796609], [0.796608, 0.797585], [0.79464, 0.795617], [0.809517, 0.810479], [0.811439, 0.812416] … [0.766264, 0.767241], [0.823185, 0.823686], [0.823685, 0.824194], [0.765303, 0.766265], [0.776573, 0.777081], [0.764326, 0.765304], [0.776073, 0.776574], [0.763365, 0.764327], [0.762404, 0.763366], [0.840061, 0.841023]])
julia> @time minimise(f, 0..1, tol=1e-6)
2.480023 seconds (12.10 M allocations: 1.233 GiB, 4.44% gc time)
([0.918077, 0.918081], Interval{Float64}[[0.799997, 0.799999], [0.799996, 0.799998], [0.80001, 0.800012], [0.799992, 0.799994], [0.800014, 0.800016], [0.79999, 0.799992], [0.800016, 0.800018], [0.800021, 0.800023], [0.799982, 0.799984], [0.800023, 0.800025] … [0.801227, 0.801229], [0.798753, 0.798755], [0.801229, 0.801231], [0.80121, 0.801212], [0.798769, 0.798771], [0.798749, 0.798751], [0.801233, 0.801235], [0.798767, 0.798769], [0.801225, 0.801227], [0.798766, 0.798768]])
Ouch. This should be much faster with the mean-value form.
But I guess you don't really need high accuracy for range bounding.
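For reference, the mean-value form mentioned above encloses f over a box X by f(m) + f'(X)·(X - m), with m the midpoint of X. A minimal pure-Julia sketch of the idea, using a hand-rolled interval type rather than IntervalArithmetic.jl (on the full box 0 .. 1 this form is actually wider than direct evaluation; it pays off once the boxes shrink during bisection):

```julia
# Minimal interval type with naive (outward-rounding-free) arithmetic;
# a sketch only, not the IntervalArithmetic.jl implementation.
struct Iv
    lo::Float64
    hi::Float64
end
Iv(x::Real) = Iv(float(x), float(x))

Base.:+(a::Iv, b::Iv) = Iv(a.lo + b.lo, a.hi + b.hi)
Base.:-(a::Iv, b::Iv) = Iv(a.lo - b.hi, a.hi - b.lo)
function Base.:*(a::Iv, b::Iv)
    p = (a.lo * b.lo, a.lo * b.hi, a.hi * b.lo, a.hi * b.hi)
    Iv(minimum(p), maximum(p))
end

f(x)  = Iv(1) - x * x * x * x + x * x * x * x * x   # f(x)  = 1 - x^4 + x^5
df(x) = Iv(-4) * x * x * x + Iv(5) * x * x * x * x  # f'(x) = -4x^3 + 5x^4

# Mean-value form: f(X) ⊆ f(m) + f'(X) * (X - m), with m the midpoint of X.
function meanvalue_bound(X::Iv)
    m = Iv((X.lo + X.hi) / 2)
    f(m) + df(X) * (X - m)
end
```

On a narrow box such as Iv(0.79, 0.81) the enclosure is already tight around the true minimum value 1 - 4^4/5^5 ≈ 0.91808.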
maximise is broken on the released version, but:
julia> @time minimise(x->-f(x), 0..1, tol=1e-6)
0.098266 seconds (283.01 k allocations: 14.670 MiB, 7.30% gc time)
([-1.00001, -0.999999], Interval{Float64}[[0.999999, 1], [0.999998, 1], [0.999998, 0.999999], [0.999997, 0.999999], [5.37927e-05, 5.46881e-05], [7.95874e-05, 8.0469e-05], [8.13503e-05, 8.22458e-05], [0.000102234, 0.00010313], [0.000100471, 0.000101353], [9.96036e-05, 0.000100472] … [0.000111906, 0.000112748], [0.000115297, 0.000116152], [0.000121291, 0.00012216], [0.000119569, 0.000120438], [0.00011786, 0.000118715], [0.000117019, 0.000117861], [0.000118714, 0.00011957], [0.000120437, 0.000121292], [0.000113601, 0.000114444], [0.000114443, 0.000115298]])
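Putting the two calls together gives a range bound without maximise: the lower bound of the range comes from minimise(f), and the upper bound from minimise(-f). A sketch, assuming the same packages and the minimise API used above:

```julia
using IntervalArithmetic, IntervalOptimisation

f(x) = 1 - x^4 + x^5

# minimise returns an enclosure of the global minimum plus the candidate boxes.
lo_enclosure, _ = minimise(f, 0..1, tol=1e-3)
hi_enclosure, _ = minimise(x -> -f(x), 0..1, tol=1e-3)  # workaround for maximise

# If min(-f) ∈ [a, b], then max(f) ∈ [-b, -a], so -a is a valid upper bound.
range_bound = lo_enclosure.lo .. (-hi_enclosure.lo)
```

This encloses the true range (1 - 4^4/5^5) .. 1, at the cost of the tol-dependent slack seen in the timings above.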
This is what I get with bound_taylor1:
julia> f(x) = 1 - x^4 + x^5
f (generic function with 1 method)
julia> TaylorModels.bound_taylor1( f(TaylorModel1(5, 0..0, 0..1)) )
[0.918079, 1]
julia> @time TaylorModels.bound_taylor1( f(TaylorModel1(5, 0..0, 0..1)) )
0.000160 seconds (1.23 k allocations: 86.391 KiB)
[0.918079, 1]
The actual range of the function (obtained analytically) is: (1 - 4^4/5^5) .. 1.
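For completeness, that analytic value comes from f'(x) = -4x^3 + 5x^4 = x^3(5x - 4), whose only zero inside (0, 1) is x = 4/5. A quick check in plain Julia, using exact rational arithmetic:

```julia
f(x) = 1 - x^4 + x^5

# Interior critical point: f'(x) = x^3 * (5x - 4) = 0  =>  x = 4/5.
xc   = 4 // 5                # exact rational, to avoid rounding
fmin = f(xc)                 # equals 1 - 4^4//5^5 = 2869//3125 exactly

# The maximum over [0, 1] sits at the endpoint x = 1, where f(1) = 1,
# so the exact range is (1 - 4^4/5^5) .. 1 ≈ 0.91808 .. 1.
```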
Wow, how is that so fast...?
Taylor models 😄
Ha. But this is using IntervalRootFinding.jl? Why is that so fast?
I guess it is the fact that we have polynomials.
... and in this case, there is only a local minimum, and the extrema of the domain yield the maximum.
@dpsanders Do you agree to have this merged?
Mmmmm... hold on, I just noticed a certain inconsistency...
I'll go ahead and merge this; the inconsistency is related to other stuff...
This PR re-enables bound_taylor1, which defines methods to bound the range of a TaylorModel1 (or Taylor1 polynomial) on a given interval. It uses information from the derivative, and exploits roots from IntervalRootFinding. It is robust when it works, but it may not always work. The tests include an "uncomfortable" example from some papers of Berz and Makino, for which bound_taylor1 obtains almost-tight bounds.
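The idea behind this can be sketched in a few lines of plain Julia: find the zeros of the derivative inside the domain, evaluate the function at those critical points and at the endpoints, and take the min/max. The helper names below are hypothetical, and the root search here is a simple sign-change bisection rather than the rigorous IntervalRootFinding.roots used by the actual bound_taylor1:

```julia
# Bisection for a root of g in [lo, hi], assuming g changes sign there.
function bisect_root(g, lo, hi; iters = 80)
    flo = g(lo)
    for _ in 1:iters
        mid = (lo + hi) / 2
        if sign(g(mid)) == sign(flo)
            lo, flo = mid, g(mid)
        else
            hi = mid
        end
    end
    (lo + hi) / 2
end

# Bound f on [a, b]: candidate extrema are the endpoints plus the zeros of df
# detected via sign changes on a coarse grid (non-rigorous sketch).
function bound_by_derivative(f, df, a, b; n = 64)
    xs = range(a, b; length = n + 1)
    cands = [a, b]
    for i in 1:n
        if sign(df(xs[i])) * sign(df(xs[i+1])) < 0
            push!(cands, bisect_root(df, xs[i], xs[i+1]))
        end
    end
    vals = f.(cands)
    (minimum(vals), maximum(vals))
end

f(x)  = 1 - x^4 + x^5
df(x) = -4x^3 + 5x^4

bound_by_derivative(f, df, 0.0, 1.0)   # ≈ (0.91808, 1.0)
```

Unlike this sketch, working with interval root finding makes the critical-point enclosure rigorous, which is where the "robust when it works" caveat comes from: if the derivative's roots cannot be isolated, no bound is produced.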