awecefil opened this issue 3 months ago
Thanks for the issue, this is very interesting!

For `canGetAndSetFMUstate="false"`, the default fallback is to use sampling and finite differences. We will check that!
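For illustration, such a fallback computes gradients by sampling the function at perturbed inputs, roughly like central finite differences (a sketch, not the actual FMISensitivity implementation):

```julia
# Sketch of a sampling-based fallback: perturb each parameter and
# approximate the gradient with central finite differences.
function fd_gradient(f, p::AbstractVector; h=1e-6)
    grad = similar(p)
    for i in eachindex(p)
        pplus = copy(p);  pplus[i] += h
        pminus = copy(p); pminus[i] -= h
        grad[i] = (f(pplus) - f(pminus)) / (2h)
    end
    return grad
end
```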
Hi @ThummeTo, I suspect that the first issue does not only happen when using `FMUParameterRegistrator`. I also ran an experiment on learning an unknown effect, analogous to the hybrid_ME example on SpringPendulum, and got a worse result:
```julia
# NeuralFMU setup
numStates = fmiGetNumberOfStates(referenceFMU)

net = Chain(x -> referenceFMU(x=x, dx_refs=:all),  # FMU evaluation as first layer
            Dense(numStates, 8, identity),
            Dense(8, 8, tanh),
            Dense(8, numStates))

optim = Adam()
solver = Tsit5()
neuralFMU = ME_NeuralFMU(referenceFMU, net, (tStart, tStop), solver; saveat=tSave)
```
```julia
solutionBefore = neuralFMU(x₀; parameters=init_params, saveat=tSave)
posNet_before, velNet_before = extractPosVel(solutionBefore)

paramsNet = Flux.params(neuralFMU)
params_before = deepcopy(paramsNet)

# flip negative entries of the first parameter array to positive
# (initialization step as in the hybrid_ME example)
for i in 1:length(paramsNet[1])
    if paramsNet[1][i] < 0.0
        paramsNet[1][i] = -paramsNet[1][i]
    end
end
```
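For reference, the `lossSum` passed to `train!` below is a minimal sketch in the style of the hybrid_ME example (assuming `posReal`/`velReal` hold the reference position/velocity trajectories):

```julia
# Sketch of a lossSum in the style of the hybrid_ME example;
# posReal/velReal are assumed reference trajectories.
function lossSum(p)
    solution = neuralFMU(x₀; p=p, saveat=tSave)
    posNet, velNet = extractPosVel(solution)
    return FMIFlux.Losses.mse(posReal, posNet) + FMIFlux.Losses.mse(velReal, velNet)
end
```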
```julia
NUMSTEPS = 1000
GRADIENT = :ReverseDiff

FMIFlux.train!(lossSum, neuralFMU, Iterators.repeated((), NUMSTEPS), optim;
               gradient=GRADIENT, cb=()->callb(paramsNet), printStep=false)
```
```
[ Info: LossSum[0] - Loss: 125.24109
[ Info: LossSum[1] - Loss: 124.58794
[ Info: LossSum[100] - Loss: 53.56777
[ Info: LossSum[200] - Loss: 1.509
[ Info: LossSum[300] - Loss: 0.81575
[ Info: LossSum[400] - Loss: 0.76544
[ Info: LossSum[500] - Loss: 0.76557
[ Info: LossSum[600] - Loss: 0.76563
[ Info: LossSum[700] - Loss: 0.76606
[ Info: LossSum[800] - Loss: 0.76679
[ Info: LossSum[900] - Loss: 0.76561
```
It seems to get stuck in a local optimum easily; the result is shown below.
The difference between calling `neuralFMU(x₀; p=p, showProgress=true, saveat=tSave)` inside and outside of `FMIFlux.train!()` is shown below. Here, "inside `FMIFlux.train!()`" means that `neuralFMU(...)` is called in the loss function during the training process, and "outside `FMIFlux.train!()`" means calling `neuralFMU(...)` outside the training process, e.g. `solutionBefore = neuralFMU(x₀; saveat=tSave)`.
This plot shows that the FMU output during the training process is linear, which I find weird.
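One way to reproduce the "inside `train!`" behavior without running the full training is to evaluate the loss once under ReverseDiff tracing, as `train!` does with `gradient=:ReverseDiff` (a sketch; `lossSumInspect` and the global `lastSolution` are hypothetical inspection helpers, and `p` is assumed to be the flat parameter vector):

```julia
using ReverseDiff

# Stash the solution computed inside the (traced) loss evaluation,
# then compare it against a plain, untraced call.
lastSolution = nothing
function lossSumInspect(p)
    global lastSolution = neuralFMU(x₀; p=p, saveat=tSave)
    posNet, velNet = extractPosVel(lastSolution)
    return FMIFlux.Losses.mse(posReal, posNet) + FMIFlux.Losses.mse(velReal, velNet)
end

ReverseDiff.gradient(lossSumInspect, p)           # traced evaluation, as inside train!
solutionPlain = neuralFMU(x₀; p=p, saveat=tSave)  # plain evaluation, as outside train!
```

If `lastSolution` shows the degenerate (constant/linear) trajectory while `solutionPlain` looks correct, the problem lies in the traced evaluation path rather than in `train!` itself.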
NOTE: the FMU I use in the results above is exported by OpenModelica v1.23.0 (64-bit), which may be the main reason, but I am not sure why.
Can you compare the gradients? E.g. by applying `ForwardDiff.gradient(loss, p)` (or `ReverseDiff.gradient(loss, p)`). This way we know that it's not caused by the accumulated update direction of Adam.
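Something like this (a sketch, assuming `loss` is your loss function and `p` the flat parameter vector):

```julia
using ForwardDiff, ReverseDiff

grad_fwd = ForwardDiff.gradient(loss, p)  # forward-mode reference
grad_rev = ReverseDiff.gradient(loss, p)  # the mode used by train! with :ReverseDiff
isapprox(grad_fwd, grad_rev; rtol=1e-4)   # should be true if both modes agree
```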
There are some tests in FMISensitivity.jl that you can check out: FMISensitivity tests (there, we compare many gradients against ground-truth gradients).
Currently, I am trying to use `FMUParameterRegistrator` to do parameter calibration/estimation, but I encounter some issues. I ran experiments on both ME-type and CS-type FMUs:

- Model: SpringPendulum1D.mo (from FMIZoo.jl). Exporting tool: Dymola (I directly use the FMU from FMIZoo.jl). Result: good, the parameter can be tuned correctly after training.
- Model: SpringPendulum1D.mo (from FMIZoo.jl). Exporting tool: OpenModelica v1.23.0 (64-bit). Result: the loss doesn't change during training, so the parameter is not tuned correctly; below is the loss function I use and part of the info during the training process.
Actually, I found that this issue comes from the wrong return value of `neuralFMU(x₀; parameters=params, p=p, showProgress=true, saveat=tSave)` in `lossSum`: `posNet` is an array with the same value at all time steps and `velNet` is an array with linearly increasing values. For example, `posNet = [0.5, 0.5, 0.5, ..., 0.5, 0.5, 0.5]` and `velNet = [0.0, 0.1, 0.2, 0.3, ..., 40.0]`; however, both of them should look like a sine wave.

By the way, this only happens when running `FMIFlux.train!`. It is normal if I run `neuralFMU(x₀; parameters=params, p=p, showProgress=true, saveat=tSave)` independently.

I am not sure whether `canGetAndSetFMUstate="false"` may be the reason for the weird FMU solution? OpenModelica does not seem to support this functionality for ME-type FMUs, even though I followed the OpenModelica guideline to enable it.

If I negate the return value of the loss, the training is relatively normal, but I think this is not a good way to do it.
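For context, `FMUParameterRegistrator` is wired into the net as a layer in front of the FMU evaluation, roughly like this (a sketch; the parameter name `"spring.c"` and the start value `[1.0]` are placeholders, not my exact setup):

```julia
# Sketch: the registrator registers the given parameter(s) so they are
# set on every evaluation and become trainable alongside the net weights.
# "spring.c" / [1.0] are placeholder parameter name and start value.
net = Chain(FMUParameterRegistrator(fmu, ["spring.c"], [1.0]),
            x -> fmu(x=x, dx_refs=:all))
```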
Conclusion

There are two issues when using `FMUParameterRegistrator`:

If more information is needed, please tell me. Thank you!