SciML / ModelingToolkit.jl

An acausal modeling framework for automatically parallelized scientific machine learning (SciML) in Julia. A computer algebra system providing integrated symbolics for physics-informed machine learning and automated transformations of differential equations.
https://mtk.sciml.ai/dev/

Idea: Typed axes for array variables and parameters #9

Closed: tkf closed this issue 3 years ago

tkf commented 6 years ago

When a model has multiple array variables and parameters, their sizes need to be consistent with one another. Rather than verifying this consistency manually, a better approach would be to encode the semantics of each "axis" in the computation graph. That would let SciCompDSL.jl catch some errors at graph-creation time.

Example (from my previous comment: https://github.com/JuliaDiffEq/DifferentialEquations.jl/issues/261#issuecomment-368772516):

a = ArrayAxis(:a)
b = ArrayAxis(:b)
xa = DependentVariable(:xa, a)
xb = DependentVariable(:xb, b)

# Parameters are block matrices
Maa = Parameter(:Maa, a, a)  # Na by Na matrix
Mab = Parameter(:Mab, a, b)  # Na by Nb matrix
Mbb = Parameter(:Mbb, b, b)  # Nb by Nb matrix
τa = Parameter(:τa, a)     # Na-dimensional vector
τb = Parameter(:τb, b)     # Nb-dimensional vector

de = @diffeq begin
  D*xa ./ τa = -xa .+ tanh.(Maa * xa .+ Mab * xb)
  D*xb ./ τb = -xb .+ tanh.(Mbb * xb)
end

where Na = size(Mab, 1) and Nb = size(Mab, 2) when Mab is a normal array. Taken as a whole, this is an (Na + Nb)-dimensional ODE.
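For concreteness, here is a rough hand-written version of that (Na + Nb)-dimensional ODE using plain arrays and DifferentialEquations.jl; the sizes and parameter values are only illustrative, and each equation above is transcribed literally and solved for its derivative term:

using OrdinaryDiffEq

Na, Nb = 3, 2
p = (Maa = randn(Na, Na), Mab = randn(Na, Nb), Mbb = randn(Nb, Nb),
     τa = ones(Na), τb = ones(Nb))

function rhs!(du, u, p, t)
    xa = @view u[1:Na]
    xb = @view u[Na+1:end]
    # First equation solved for D*xa: D*xa = τa .* (-xa .+ tanh.(Maa*xa .+ Mab*xb))
    du[1:Na]     .= p.τa .* (-xa .+ tanh.(p.Maa * xa .+ p.Mab * xb))
    # Second equation solved for D*xb
    du[Na+1:end] .= p.τb .* (-xb .+ tanh.(p.Mbb * xb))
    return nothing
end

u0 = randn(Na + Nb)           # state is the concatenation [xa; xb]
prob = ODEProblem(rhs!, u0, (0.0, 10.0), p)
sol = solve(prob, Tsit5())

The point of the proposal is that the size bookkeeping done by hand here (the 1:Na and Na+1:end index ranges, and the shape agreement between Maa, Mab, τa, and xa) would instead be derived from the declared axes.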

Possible benefits:

Related project:

tkf commented 6 years ago

There is a discussion on adding "Named Dimensions" to Open Neural Network Exchange:

But the discussion there is probably too oriented toward deep nets.

ChrisRackauckas commented 3 years ago

We essentially have this now!
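For reference, here is a rough sketch of how the same two-block system can be expressed with ModelingToolkit's symbolic array variables; the exact macro syntax and constructor signatures may differ across versions, so treat this as an illustration rather than a verbatim example from the docs:

using ModelingToolkit

Na, Nb = 3, 2
@variables t
D = Differential(t)
@variables xa(t)[1:Na] xb(t)[1:Nb]
@parameters Maa[1:Na, 1:Na] Mab[1:Na, 1:Nb] Mbb[1:Nb, 1:Nb] τa[1:Na] τb[1:Nb]

# Scalarize the symbolic arrays so ordinary array semantics
# (matrix multiplication, broadcasting, shape checks) apply.
xa, xb = collect(xa), collect(xb)
Maa, Mab, Mbb = collect(Maa), collect(Mab), collect(Mbb)
τa, τb = collect(τa), collect(τb)

# A shape mismatch (e.g. declaring Mab with the wrong size)
# throws a DimensionMismatch here, at system-construction time.
eqs = vcat(D.(xa) ./ τa .~ -xa .+ tanh.(Maa * xa .+ Mab * xb),
           D.(xb) ./ τb .~ -xb .+ tanh.(Mbb * xb))

@named sys = ODESystem(eqs, t)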