Closed SebastianM-C closed 7 months ago
I'm having trouble with this bug, but `getu` and `observed` are identical (the former calls the latter behind the scenes if the symbol isn't a direct index into the state vector). The case you've found where `observed` is slowed down after calling `getu` also works the other way around: if `getu` is called first and then `observed`, `obs2` is slower; and if `getu` is called again, `f2` is slowed down:
```julia
# Using the same initial model
julia> f2 = getu(sol, s1 + s2)

julia> @btime $f2($sol, 2)
  88.906 ns (5 allocations: 752 bytes)

julia> obs2 = SymbolicIndexingInterface.observed(sol, s1 + s2)

julia> @btime $obs2($sol.u[2], parameter_values($sol), $sol.t[2])
  228.753 ns (6 allocations: 832 bytes)

julia> f2 = getu(sol, s1 + s2) # recreate f2

julia> @btime $f2($sol, 2)
  229.476 ns (6 allocations: 832 bytes)
```
I think this is because of how the `observed` function in `ModelingToolkit.jl/src/systems/diffeqs/abstractodesystem.jl` works. The first time it is called, it has to compile a function for the expression, which it then caches in a `Dict{Any, Any}`. Subsequent calls to `observed` with the same expression just look up that function in the cache, which slows it down. This might be because when first called, Julia knows the type of the generated function, but subsequent cache lookups infer it as `Any`.
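The inference problem described above can be sketched with a toy cache. This is a hypothetical illustration, not ModelingToolkit's actual code: `get_compiled!` and `apply` are made-up names, and the closure stands in for the real generated function.

```julia
# Hypothetical sketch of why caching compiled functions in a Dict{Any, Any}
# hurts performance: the value retrieved from the cache is inferred as Any,
# so every call through it goes via dynamic dispatch.
const CACHE = Dict{Any, Any}()

function get_compiled!(expr)
    get!(CACHE, expr) do
        # stand-in for the real "compile a function for this expression" step
        u -> u[1] + u[2]
    end
end

f = get_compiled!(:(s1 + s2))   # inference only sees f::Any
f([1.0, 2.0])                   # dynamic dispatch on every call

# A common workaround is a function barrier: inside `apply`, the concrete
# type of `g` is known, so the inner call is fast.
apply(g, u) = g(u)
```

This would be consistent with the observation that the first call (where the freshly compiled function's concrete type is known) is fast, while later lookups through the untyped cache are not.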
The `view` in your example is wrong, and will give an incorrect result. It should be `view(sol, 2, :)`.
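To make the index-order distinction concrete, here is a toy matrix standing in for the solution array, assuming the usual layout where rows index states and columns index time points:

```julia
# Rows = states, columns = timesteps (illustrative data, not from the issue).
A = [1.0  2.0  3.0;     # state 1 over three timesteps
     10.0 20.0 30.0]    # state 2 over three timesteps

ts = view(A, 2, :)      # time series of state 2
u2 = view(A, :, 2)      # full state at the 2nd time point -- easy to mix up
```

Swapping the two index positions silently returns the wrong slice rather than erroring, which is why the mistake is easy to miss.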
> The case you've found where `observed` is slowed down after calling `getu` also works the other way around

Oh, that explains why I thought it's slower.

> The `view` in your example is wrong, and will give an incorrect result. It should be `view(sol, 2, :)`

Thanks for pointing that out.
I looked at a couple of cases comparing `getu` and `SII.observed`, and for unknowns `getu` seems to have ideal performance (i.e. the same as direct indexing into the vector). When computing the value of expressions, it looks like `SII.observed` is faster, but for some reason if I redefine the function for `SII.observed` after using `getu`, I get the same performance as `getu`. Does `getu` replace the cached generated function, or am I missing something?

I also tried this on a larger model, and there the time for `SII.observed` was still around 100 ns for a simple expression, but `getu` was slower. Is the time for `getu` expected to increase with the size of the model?

I was wondering if the slowdown in `getu` was related to the length of the solution, but I tested that and it's not it.

Versions:
julia 1.10