SciML / Catalyst.jl

Chemical reaction network and systems biology interface for scientific machine learning (SciML). High performance, GPU-parallelized, and O(1) solvers in open source software.
https://docs.sciml.ai/Catalyst/stable/

Update doc discussion of parameter typing and check tests #1011

Open isaacsas opened 1 month ago

isaacsas commented 1 month ago

Given that MTK has now changed behavior and promotes all tunables to a common type, the doc discussion of parameter typing needs to be checked and updated accordingly (or perhaps just removed until this all clearly stabilizes). Likewise, the tests need to be checked to confirm they are still testing what they were intended to test (even if they pass).

TorkelE commented 1 month ago

Sounds good. If I understand it correctly, the big difference is that parameters with non-default types cannot be tunable (and we should put a note about that wherever relevant)?
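For reference, a minimal sketch of the kind of declarations in question, assuming MTK's type-annotation syntax for `@parameters` (the exact tunability rule is pinned down later in this thread):

```julia
using ModelingToolkit

# The default symtype is Real; annotations override it.
@parameters k1 k2::Float32 n::Int64
# k1 and k2 are float-like and eligible to be tunables;
# n::Int64 is not, and remains an ordinary (non-tunable) constant.
```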

ChrisRackauckas commented 1 month ago

Tunables are always kept as a vector, so if you, say, remake some of them to Dual numbers, it will convert them all to Dual. This is of course a requirement for any AD or optimization library, and it is also a requirement for any adjoint overload.
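As a hedged illustration of why (using ForwardDiff directly rather than MTK itself): forward-mode AD rebuilds the whole parameter vector with Dual numbers, so every entry must share a type that supports that conversion.

```julia
using ForwardDiff

# Stand-in for a loss over a tunables vector.
loss(p) = p[1]^2 + 2p[2]

# gradient replaces every entry of p with a Dual number; a single
# homogeneous vector type is what makes this conversion possible.
ForwardDiff.gradient(loss, [1.0, 3.0])  # -> [2.0, 2.0]
```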

isaacsas commented 1 month ago

I think the rule is that integer-typed parameters are not tunable and can be declared as we currently do, while any mixture of float-typed parameters is considered tunable and promoted to a common type. So mixing Float32 and Float64 symbolics will ultimately result in a single type (I guess Float64).
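For what it's worth, that is just Julia's ordinary promotion behavior:

```julia
promote_type(Float32, Float64)  # Float64
[1.0f0, 2.0]                    # Vector{Float64}; the Float32 literal is promoted
```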

ChrisRackauckas commented 1 month ago

@AayushSabharwal can we add a check in MTK that any tunable has a `symtype` of `AbstractFloat`?

ChrisRackauckas commented 1 month ago

> Any mixture of float-typed parameters is considered tunable and promoted to a common type. So mixing Float32 and Float64 symbolics will ultimately result in a single type (I guess Float64).

Yes exactly, it will get the promotion of all tunables.

ChrisRackauckas commented 1 month ago

Which, BTW, is the same semantics as the AD and optimization libraries; this just matches that behavior to ensure compatibility of the generated functions. So it is no longer symbolically possible to generate an MTK model whose tunable set is incompatible with actually being tuned. That is the major change.

TorkelE commented 1 month ago

Basically, it seems that as long as one doesn't mix non-default parameter types with anything that requires AD/tuning, one should be fine. And if one does, one should know what one is doing.

Would it make sense to have a more detailed section in the MTK docs, which we could then link from Catalyst where relevant?

ChrisRackauckas commented 1 month ago

We will need to write one, yes.

AayushSabharwal commented 1 month ago

> @AayushSabharwal can we add a check in MTK that any tunable has a `symtype` of `AbstractFloat`?

We already do. Since `@parameters` by default gives a symtype of `Real`, and forcing people to annotate `::AbstractFloat` would be breaking, it checks `symtype == Real || symtype <: AbstractFloat` (and the equivalent for arrays). Any variable not passing this check is put in the constants portion.
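For readers of the thread, a hypothetical restatement of that check as plain Julia (a sketch, not MTK's actual internal code):

```julia
# Real is the default symtype of @parameters; AbstractFloat subtypes
# also qualify. Anything else is routed to the constants portion.
istunable_symtype(T::Type) = T == Real || T <: AbstractFloat

istunable_symtype(Real)     # true  -- the @parameters default
istunable_symtype(Float32)  # true
istunable_symtype(Int64)    # false -- kept as a non-tunable constant
```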