Everything else has stayed the same. Why is that? Or am I doing something wrong?
The full code to reproduce this issue is:
using DataDrivenDiffEq
using LinearAlgebra
using ModelingToolkit
using OrdinaryDiffEq
function michaelis_menten(u, p, t)
    [0.6 - 1.5u[1] / (0.3 + u[1])]
end
u0 = [0.5]
problem_1 = ODEProblem(michaelis_menten, u0, (0.0, 4.0));
solution_1 = solve(problem_1, Tsit5(), saveat=0.1);
problem_2 = ODEProblem(michaelis_menten, 2 * u0, (4.0, 8.0));
solution_2 = solve(problem_2, Tsit5(), saveat=0.1);
problem_3 = ODEProblem(michaelis_menten, 3 * u0, (8.0, 12.0));
solution_3 = solve(problem_3, Tsit5(), saveat=0.1);
function michaelis_menten(X::AbstractMatrix, p, t::AbstractVector)
    reduce(hcat, map((x, ti) -> michaelis_menten(x, p, ti), eachcol(X), t))
end
data = (
    Experiment_1=(X=Array(solution_1), t=solution_1.t, DX=michaelis_menten(Array(solution_1), [], solution_1.t)),
    Experiment_2=(X=Array(solution_2), t=solution_2.t, DX=michaelis_menten(Array(solution_2), [], solution_2.t)),
    Experiment_3=(X=Array(solution_3), t=solution_3.t, DX=michaelis_menten(Array(solution_3), [], solution_3.t))
)
prob = DataDrivenDiffEq.ContinuousDataset(data);
@parameters t
D = Differential(t)
@variables u[1:1](t)
h = [monomial_basis(u[1:1], 4)...]
basis = Basis([h; h .* D(u[1])], u, implicits=D.(u), iv=t)
println(basis) # hide
sampler = DataSampler(
    Split(ratio=0.8), Batcher(n=10)
)
opt = ImplicitOptimizer(1e-1:1e-1:5e-1)
res = solve(prob, basis, opt, normalize=false, denoise=false, by=:min, sampler=sampler, maxiter=1000);
From what I tried, I am not sure whether the function (from set.jl)
function (b::AbstractBasis)(dx::AbstractMatrix, d::DataDrivenDataset)
    last = 1
    @views for (i, s) in enumerate(cumsum(d.sizes))
        b(dx[:, last:s], d.probs[i])
        last += s
    end
    return
end
is really correct. The line last += s seems strange to me, as s is given by a cumsum. Shouldn't it be last = s + 1?
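To make the suspicion concrete, here is a minimal, self-contained sketch (the helper block_ranges and the sizes [2, 3, 4] are made up for illustration) comparing the two update rules for slicing a concatenated matrix back into per-subproblem column blocks:

```julia
# Compare the two update rules for walking over cumsum(sizes).
# `fixed = false` mimics the current `last += s`;
# `fixed = true` uses the suspected correction `last = s + 1`.
function block_ranges(sizes::AbstractVector{<:Integer}; fixed::Bool)
    ranges = UnitRange{Int}[]
    last = 1
    for s in cumsum(sizes)
        push!(ranges, last:s)
        last = fixed ? s + 1 : last + s
    end
    return ranges
end

sizes = [2, 3, 4]                            # hypothetical column counts per subproblem
println(block_ranges(sizes; fixed = false))  # ranges 1:2, 3:5, 8:9 -- columns 6 and 7 are never visited
println(block_ranges(sizes; fixed = true))   # ranges 1:2, 3:5, 6:9 -- contiguous, covers all 9 columns
```

With last += s, the third block is 8:9 (2 columns) rather than 6:9 (4 columns), so it no longer matches the size of the third subproblem. Note also that with only two subproblems the two rules happen to agree (1 + s₁ = s₁ + 1), which would explain why the default two-subproblem example works while adding a third one crashes.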
For my Bachelor's thesis I need to use Datasets, so I was playing around with the Michaelis-Menten example. By default it features 2 subproblems and everything works fine. But if we add a third subproblem, as in the code above, the solver crashes and throws an error.