Closed lberti closed 1 year ago
Hi @lberti, thanks for sharing your code. There are a few things going on here.
Using a basis (even when there is no dimension reduction) introduces a coordinate transform, so you should not expect `rom.A_.entries` to be the same as `A` unless the basis is the identity matrix. Since you aren't looking for dimension reduction, you can 'turn off' the basis by passing `None` as the first argument of `fit()` instead of `basis`.
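To illustrate the coordinate-transform point, here is a minimal NumPy sketch (independent of the opinf API, with a made-up `A` and basis `V`): projecting through an orthonormal but non-identity basis changes the matrix entries even when no dimensions are dropped.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))             # full-order system matrix (illustrative)

# Square orthonormal basis: no dimension reduction, but V is not the identity.
V, _ = np.linalg.qr(rng.standard_normal((4, 4)))

# Galerkin projection: the learned operator lives in V-coordinates.
A_rom = V.T @ A @ V
print(np.allclose(A_rom, A))                # False for this V: the coordinates changed

# The information is still there, just transformed.
print(np.allclose(V @ A_rom @ V.T, A))      # True

# With the identity basis the projection is a no-op.
I = np.eye(4)
print(np.allclose(I.T @ A @ I, A))          # True
```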
If your goal is to recover the full-order system matrices, then you will need exact time derivative data. Operator inference is sensitive to error in the time derivatives, which is why the regularization needs to be chosen carefully in general. Based on my tinkering, it looks like L2 regularization cannot fully mitigate the issue for this problem, perhaps because the full-order system matrices are sparse. We do not yet have L1 (sparsity-promoting) regularization implemented in opinf, but it would likely help here. In that case, because there is no dimension reduction and the right-hand side is polynomial, opinf would be equivalent to SINDy with a linear + quadratic candidate library.
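To see why exact derivatives make recovery possible, here is a small NumPy sketch (not the opinf or SINDy API) that fits a linear + quadratic candidate library to Lotka-Volterra data by least squares; the coefficient values `a, b, c, d` and the sampling are made up for illustration. With exact derivatives the true operators lie in the column space of the library, so least squares recovers them to machine precision.

```python
import numpy as np

# Illustrative Lotka-Volterra system: dx/dt = a*x - b*x*y, dy/dt = -c*y + d*x*y
a, b, c, d = 1.5, 1.0, 3.0, 1.0

rng = np.random.default_rng(0)
Q = rng.uniform(0.5, 2.0, size=(2, 200))       # sampled states (x, y)
x, y = Q

# Exact time derivatives evaluated from the model (no differentiation error).
dQ = np.vstack([a * x - b * x * y, -c * y + d * x * y])

# Linear + quadratic candidate library, as in SINDy: [x, y, x^2, x*y, y^2].
D = np.vstack([x, y, x * x, x * y, y * y]).T   # (200, 5) data matrix

# Least-squares fit of the operator coefficients, one row per equation.
O = np.linalg.lstsq(D, dQ.T, rcond=None)[0].T  # (2, 5)

# Recovered coefficients match the true sparse operators.
print(np.round(O, 6))
```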
You can read about operator inference recovering ROMs based on Galerkin projection in this paper, or Section 4.1.2 of this thesis. I'm also attaching a notebook that demonstrates recovery of the full-order system operators based on your code.
(This is a Jupyter notebook; rename it `issue40.ipynb` and open it with `jupyter notebook issue40.ipynb` or `jupyter lab issue40.ipynb`.)
I am indeed planning to do model reduction, but first I wanted to be able to completely recover my model. Thank you very much for the explanation and the references!
Hello,
I am trying to use this package to tackle the Lotka-Volterra system. I don't really expect it to be reducible, but I would like to recover the model matrices using your library. In theory, since the right-hand side is polynomial, I would expect to recover it pretty accurately. However, I do not manage to do so.
Please find attached the Python file that I am using, which shows the problems that I encounter:
Thank you in advance.