SciML / ModelingToolkit.jl

An acausal modeling framework for automatically parallelized scientific machine learning (SciML) in Julia. A computer algebra system for integrated symbolics for physics-informed machine learning and automated transformations of differential equations.
https://mtk.sciml.ai/dev/

Document specifying initial values of specific parameters in optimizing parameters #2606

Open bgctw opened 5 months ago

bgctw commented 5 months ago

In the examples of remake.md, the order of parameters in the queried parameter object is not known. This makes it difficult to provide reasonable initial values for each parameter.

Document how the order of parameters can be queried.
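
For context, the kind of query I would like to see documented looks roughly like this (a sketch using a toy system rather than the tutorial's model; I am assuming `parameters` and `tunable_parameters` are the relevant accessors, and whether their order matches the internal tunable vector is exactly what I would like confirmed):

```julia
using ModelingToolkit
using ModelingToolkit: t_nounits as t, D_nounits as D

# Toy system for illustration only
@parameters γ δ
@variables x(t)
@mtkbuild sys = ODESystem([D(x) ~ γ * x - δ * x^2], t)

parameters(sys)                          # symbolic parameters of the completed system
ModelingToolkit.tunable_parameters(sys)  # parameters that end up in the tunable vector
```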

ChrisRackauckas commented 5 months ago

There is no ordering to the parameters. There is no single array in which they are necessarily put; it's a multi-dimensional object.

bgctw commented 5 months ago

Some more detail of my motivation: In the tutorial, the optimization is started like

```julia
optprob2 = OptimizationProblem(
    optfn2, rand(4), (odeprob, timesteps, data), lb = 0.1zeros(4), ub = 3ones(4))
```

However, it is not clear which entry of the rand(4) initialization pertains to which parameter. So how do you specify a specific value for, e.g., γ and a different magnitude or sign for δ?

If I understand Chris correctly, one cannot rely on the order in the marshaled vector provided by canonicalize. I had hoped that this order at least stays constant over the lifetime of the built system or the problem, so that I could infer it beforehand and use it during optimization.
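
For reference, the inspection I was attempting looks roughly like this (`parameter_values` is from SymbolicIndexingInterface and `Tunable`/`canonicalize` from SciMLStructures, as used later in this thread; pairing the result with `tunable_parameters` is my assumption about how to recover the names):

```julia
using SymbolicIndexingInterface: parameter_values
using SciMLStructures: Tunable, canonicalize

ps = parameter_values(odeprob)                          # MTKParameters object of the problem
tunable, repack, aliased = canonicalize(Tunable(), ps)  # flat vector of tunable values
# Hoped-for pairing of values with names, if the orders really correspond:
# collect(zip(ModelingToolkit.tunable_parameters(sys), tunable))
```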

ChrisRackauckas commented 5 months ago

The ordering of the tunables is set when the system is marked complete, which, if you use @mtkbuild, happens during that step. It does not change after further steps.

We probably need some tool to let people know the ordering in the tunable vector. Though I wonder if just using setu is easier.

bgctw commented 5 months ago

> just using setu is easier.

I assume you mean setp. That would be super nice. However, it was ruled out because it is mutating and hence not differentiable.
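
To illustrate with a generic ForwardDiff example (not MTK-specific): writing a Dual number into a Float64 buffer is exactly what fails, and it is the same class of error I report at the end of this thread.

```julia
using ForwardDiff

p = [1.0, 2.0]                        # Float64 parameter buffer

function loss(x)
    p[1] = x                          # mutating "setter": requires convert(Float64, x)
    return p[1]^2
end

ForwardDiff.derivative(loss, 3.0)     # MethodError: no method matching Float64(::ForwardDiff.Dual{...})
```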

bgctw commented 5 months ago

Could a non-mutating version of setp be implemented?

I envision a functor (callable object) that is set up with the Nums to be updated, stores the corresponding parameter indices internally, and can then be applied to a Problem or an MTKParameters object together with a vector or tuple of values (a tuple if the parameters are of different types) to efficiently create a modified Problem or MTKParameters object. Maybe it could store some intermediate information when it encounters a new type, such as a Dual number, to make the next update for that type more efficient.

something like:

```julia
u_nums_updated = []
p_nums_updated = [γ, δ]
my_remaker = create_problem_remaker(odeprob, u_nums_updated, p_nums_updated)
probo = my_remaker(odeprob, (1.0, -20))
```
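
To make the intent concrete, a rough sketch (`create_problem_remaker` and `ProblemRemaker` are hypothetical names; this lazy version just forwards to `remake` with a symbolic map instead of caching indices, and assumes remake can handle a partial parameter map):

```julia
using SciMLBase: remake

# Hypothetical helper: remembers which parameters get overwritten and returns a
# new problem on each call, leaving the original untouched.
struct ProblemRemaker{P}
    p_nums::P   # symbolic parameters to update, e.g. [γ, δ]
end

create_problem_remaker(prob, u_nums, p_nums) = ProblemRemaker(p_nums)

function (r::ProblemRemaker)(prob, vals)
    # An efficient version would cache the parameter indices instead of building a Dict.
    remake(prob; p = Dict(r.p_nums .=> vals))
end
```
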
ChrisRackauckas commented 5 months ago

> Could a non-mutating version of setp be implemented?

That's probably the right solution.

AayushSabharwal commented 5 months ago

> Could a non-mutating version of setp be implemented?

That's basically what remake_buffer does. Unfortunately, it requires a symbolic map of all parameter values. What you could do is first use replace to create a new parameter object where everything is a Dual, and then use setp to update the specific parameters to optimize.

```julia
# parameter_values/setp come from SymbolicIndexingInterface;
# Tunable/canonicalize/replace come from SciMLStructures; remake from SciMLBase.
function loss(x, p)
    prob = p[1]
    ps = parameter_values(prob)                           # current MTKParameters object
    tunable, _, _ = canonicalize(Tunable(), ps)           # flat Float64 vector of tunables
    newps = replace(Tunable(), ps, eltype(x).(tunable))   # or use a better method to convert the array
    setp(prob, parameters_to_optimize)(newps, x)          # overwrite the parameters being optimized
    newprob = remake(prob; p = newps)
    # ...
end
```

This can be made non-allocating using PreallocationTools.jl.
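
For what it's worth, a sketch of how that could look with a preallocated Dual-capable buffer (DiffCache/get_tmp from PreallocationTools; the surrounding names mirror the snippet above):

```julia
using PreallocationTools: DiffCache, get_tmp

# Allocate once, outside the loss, sized like the flat tunable vector.
tunable0, _, _ = canonicalize(Tunable(), parameter_values(prob))
tunable_cache = DiffCache(copy(tunable0))

function loss(x, p)
    prob = p[1]
    ps = parameter_values(prob)
    buf = get_tmp(tunable_cache, x)             # reuses a Dual-typed buffer when x is Dual
    tunable, _, _ = canonicalize(Tunable(), ps)
    copyto!(buf, tunable)
    newps = replace(Tunable(), ps, buf)
    setp(prob, parameters_to_optimize)(newps, x)
    newprob = remake(prob; p = newps)
    # ... solve newprob and compute the loss as before
end
```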

ChrisRackauckas commented 5 months ago

> Unfortunately, it requires a symbolic map of all parameter values

That seems a bit odd. Why not allow it to take a subset and then just promote the others?

AayushSabharwal commented 5 months ago

I ended up refuting my own arguments while writing a response 😅 so I guess we can make it so that remake_buffer calls promote_type on the arguments it receives, and converts all existing values to that type.
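
To spell out the promotion rule generically (just an illustration of the idea, not the actual remake_buffer implementation):

```julia
using ForwardDiff

old_vals = (1.0, 2.0, 3.0)                 # existing parameter values (Float64)
new_vals = (ForwardDiff.Dual(4.0, 1.0),)   # update for a subset of them (Dual)

T = promote_type(eltype(old_vals), eltype(new_vals))
promoted = T.(old_vals)                    # all existing values converted to the promoted type
# Now the subset can be written into `promoted` without a convert error.
```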

AayushSabharwal commented 5 months ago

https://github.com/SciML/ModelingToolkit.jl/pull/2633 and https://github.com/SciML/SymbolicIndexingInterface.jl/pull/66 are what we need.

bgctw commented 3 months ago

What are the implications of these developments? From the discussion above, I understood that setp is going to support AD. Is that right?

When I update packages and run the setp example code in this issue, I still get the error:

```
MethodError: no method matching Float64(::ForwardDiff.Dual{ForwardDiff.Tag{var"#1#2", Float64}, Float64, 1})
```