Closed · AayushSabharwal closed 3 months ago
Not to mention that no AD or optimization library supports having the tuning vector contain different types; I don't see that changing any time soon, nor do I see a case where that would be a real optimization.
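For readers following along, here is a minimal sketch of the layout in question (assuming MTK v9's SciMLStructures.jl interface; the toy system and values are only illustrative):

```julia
using ModelingToolkit, SciMLStructures
using ModelingToolkit: t_nounits as t, D_nounits as D

@parameters a = 1.0 b[1:2] = [2.0, 3.0]   # a scalar and an array tunable
@variables x(t) = 0.0
@mtkbuild sys = ODESystem([D(x) ~ a * x + b[1] + b[2]], t)
prob = ODEProblem(sys, [], (0.0, 1.0))

# All tunables, including the array `b`, are concatenated into one flat vector;
# `canonicalize` returns it together with a `repack` closure and an aliasing flag:
tunables, repack, alias = SciMLStructures.canonicalize(SciMLStructures.Tunable(), prob.p)
tunables   # e.g. a 3-element Vector{Float64}
```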
Thanks for helping with any migration this will entail. Would this then mean we can't have integer parameters that stay integer, or is that not an issue as they aren't tunable?
Also, isn't this a breaking change at some level? I'd imagine type-specific user-registered functions may stop working if parameters are being silently promoted now. Will this be held for an MTK 10 release?
Note there are likely doc updates needed here as well; I believe @TorkelE wrote about typing parameters for the V14 docs.
It's not breaking if integer-valued parameters are not marked as tunable. This is just changing the internal representation of tunables. @AayushSabharwal can you confirm that `@parameters p=1` defaults to non-tunable? Only AbstractFloats should default to tunable.
> Thanks for helping with any migration this will entail. Would this then mean we can't have integer parameters that stay integer, or is that not an issue as they aren't tunable?
They will still be integers, they just won't be in the tunables portion. `@parameters p::Integer = 1` will still be stored as an integer.
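A minimal sketch of what that partitioning looks like in practice (assuming MTK v9's `MTKParameters` and symbolic-indexing APIs; the toy system is only illustrative):

```julia
using ModelingToolkit
using ModelingToolkit: t_nounits as t, D_nounits as D

@parameters k = 2.0 n::Int = 3
@variables x(t) = 1.0
@mtkbuild sys = ODESystem([D(x) ~ -k * x^n], t)

tunable_parameters(sys)   # only `k`; the Int-annotated `n` is excluded
prob = ODEProblem(sys, [], (0.0, 1.0))
prob.ps[n]                # 3 -- still an integer, just in a different partition
```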
> Can you confirm that `@parameters p=1` defaults to non-tunable? Only AbstractFloats should default to tunable.
Only AbstractFloats would default to tunable, yes, but `@parameters p = 1` is by default a `BasicSymbolic{Real}`, which we treat as a float. To get an integer, you need the type annotation.
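For concreteness, a hypothetical REPL check of those defaults (assuming Symbolics' `symtype`/`unwrap` utilities):

```julia
using ModelingToolkit, Symbolics

@parameters p = 1 q::Int = 1
Symbolics.symtype(Symbolics.unwrap(p))   # Real  -- treated as a float, so tunable
Symbolics.symtype(Symbolics.unwrap(q))   # Int64 -- stays an integer, non-tunable
```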
That would be breaking then, because we previously respected it as an integer there. I really think we should set tunability more carefully by default. I don't think we can go forward with this without that, since otherwise you very easily lose people's integers and bools.
It's not breaking. `@parameters p = 1` has always been a float, since v9.
Parameters that users have explicitly marked as integers/bools will still be integers/bools, just in a different partition.
Oh, I thought we had a conversion. Okay, if that was already respecting the `symtype` then we're good.
We already assume that if a symbolic parameter isn't explicitly annotated as an integer it gets converted, so that case shouldn't be an issue.
Did these extra-specific typings cause errors in tests with the new MTK? The tests here were not really intended to be optimised code for AD, but rather to exercise various really niche cases and check that they work, to potentially catch niche errors.
With https://github.com/SciML/ModelingToolkit.jl/pull/2908, tunables are being merged into one `Vector{T}`, where `T` is some floating-point type. This is because tunables are intended for optimization and sensitivity analysis, but AD doesn't work for integers. Not splitting into `Tuple{Vector{T1}, Vector{T2}}` also has performance benefits for sensitivity analysis, and helps reduce a significant number of bugs. Array tunables are also concatenated into this vector. Accessing them is done through a view to prevent unnecessary performance overhead, even in MTK's generated functions. To this end, parameters are promoted to a common type. Overly specific type annotations are also in general invalid, since the parameters could also be duals.
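A sketch of the promotion described above (illustrative only; assumes the merged-tunables layout from the MTK PR and the SciMLStructures.jl interface):

```julia
using ModelingToolkit, SciMLStructures
using ModelingToolkit: t_nounits as t, D_nounits as D

@parameters a = 1.0f0 b = 2.0   # Float32 and Float64 defaults
@variables x(t) = 0.0
@mtkbuild sys = ODESystem([D(x) ~ a * x + b], t)
prob = ODEProblem(sys, [], (0.0, 1.0))

# Both tunables land in one flat vector, promoted to a common float type:
tunables, repack, alias = SciMLStructures.canonicalize(SciMLStructures.Tunable(), prob.p)
eltype(tunables)   # a single common type (Float64 here)
```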
This PR is a draft so it can be rerun once the aforementioned MTK PR is merged and tagged.