Closed jankap closed 8 months ago
Hey, @jankap ! Thanks for your message! Regarding your questions:
Every method minimizes the prediction error except MetaMSS. I developed MetaMSS during my master's degree, so you won't find many sources about it. It's based on metaheuristic algorithms, and the idea is to find the best model by minimizing the simulation error. You can optimize the model based on free-run simulation or n-steps-ahead prediction. Take a look at this example and let me know if it is useful for you (for now, it's the only method I have implemented regarding simulation error minimization).
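To make the distinction concrete, here is a minimal pure-Python sketch (a toy first-order model, not MetaMSS or SysIdentPy's actual implementation) of the difference between one-step-ahead prediction, which feeds the model measured outputs, and free-run simulation, which feeds the model its own past predictions:

```python
# Toy NARX model: y(k) = 0.5*y(k-1) + 0.8*u(k-1)  (illustrative only)

def narx_step(y_prev, u_prev, a=0.5, b=0.8):
    return a * y_prev + b * u_prev

def one_step_ahead(y_measured, u):
    # each prediction uses the *measured* previous output
    return [narx_step(y_measured[k - 1], u[k - 1]) for k in range(1, len(u))]

def free_run(y0, u):
    # each prediction uses the model's own previous *prediction*
    y_hat = [y0]
    for k in range(1, len(u)):
        y_hat.append(narx_step(y_hat[-1], u[k - 1]))
    return y_hat

# "Measured" data comes from a slightly different true system (a = 0.6)
# to mimic model mismatch
u = [1.0, 0.5, -0.3, 0.8, 0.2, -0.1]
y = [0.0]
for k in range(1, len(u)):
    y.append(0.6 * y[-1] + 0.8 * u[k - 1])

osa = one_step_ahead(y, u)
frs = free_run(y[0], u)

# free-run errors accumulate through the feedback of past predictions,
# so the free-run loss is larger than the one-step-ahead loss
osa_err = sum((yk - yh) ** 2 for yk, yh in zip(y[1:], osa))
frs_err = sum((yk - yh) ** 2 for yk, yh in zip(y[1:], frs[1:]))
```

Minimizing `frs_err` instead of `osa_err` is what distinguishes simulation-error-based structure selection from the usual prediction-error approach.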
You are right. Sometimes both terms are used as synonyms, but they are not the same. The main difference is in the training process: the NARMAX model has lagged noise terms to try to remove the bias in the parameter estimation in colored-noise scenarios. But after you get the model parameters, you drop the lagged noise terms and use the model with only input and output regressors. Take a look at this example to check this effect. Using the extended_least_squares algorithm in SysIdentPy means you are getting a NARMAX model. If that parameter is False, you are getting a NARX model (or some other variation without the MA part).
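To see why the lagged noise terms matter, here is a hedged, pure-Python sketch of the extended-least-squares idea on a toy scalar model (this is not SysIdentPy's `extended_least_squares` implementation, just an illustration of the technique): plain least squares is biased when the noise is colored, and adding estimated residuals as a noise regressor during training reduces that bias.

```python
import random

random.seed(42)

# Simulate y(k) = 0.5*y(k-1) + e(k) + 0.5*e(k-1)  -- colored (MA) noise
N = 2000
e = [random.gauss(0.0, 1.0) for _ in range(N)]
y = [0.0] * N
for k in range(1, N):
    y[k] = 0.5 * y[k - 1] + e[k] + 0.5 * e[k - 1]

# Plain least squares of y(k) on y(k-1): biased, because y(k-1) is
# correlated with the lagged noise e(k-1)
sxx = sum(y[k - 1] ** 2 for k in range(1, N))
sxy = sum(y[k - 1] * y[k] for k in range(1, N))
a_ls = sxy / sxx

# Extended least squares: add the estimated residuals as an extra
# regressor and iterate; the noise terms are used only during training
a_els, c_els = a_ls, 0.0
r = [0.0] * N
for _ in range(10):
    for k in range(1, N):
        r[k] = y[k] - a_els * y[k - 1] - c_els * r[k - 1]
    sxx = sum(y[k - 1] ** 2 for k in range(1, N))
    srr = sum(r[k - 1] ** 2 for k in range(1, N))
    sxr = sum(y[k - 1] * r[k - 1] for k in range(1, N))
    sxy = sum(y[k - 1] * y[k] for k in range(1, N))
    sry = sum(r[k - 1] * y[k] for k in range(1, N))
    det = sxx * srr - sxr ** 2       # 2x2 normal equations, solved by hand
    a_els = (srr * sxy - sxr * sry) / det
    c_els = (sxx * sry - sxr * sxy) / det

# a_els should land closer to the true 0.5 than a_ls; for prediction you
# then drop the noise regressor and keep only y(k) = a_els * y(k-1)
```

After training, only the input/output part of the model is kept, which is exactly the "drop the lagged noise terms" step described above.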
You can check this out in the Billings book (Nonlinear System Identification: NARMAX Methods in the Time, Frequency, and Spatio-Temporal Domains) or in his paper (https://eprints.soton.ac.uk/251147/2/778742007_content.pdf), Section 2, Equation 4. There are other papers (from him or by different authors) that I can link for you if you want. It's worth mentioning that adding lagged noise terms in the training process can be tricky, because NARMAX models become nonlinear in their parameters.
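For quick reference, the general NARMAX form (as defined by Billings; the exact lag notation may differ slightly from Equation 4 in the linked paper) is:

```latex
y(k) = F\big[\,y(k-1), \dots, y(k-n_y),\; u(k-d), \dots, u(k-d-n_u),\; e(k-1), \dots, e(k-n_e)\,\big] + e(k)
```

Dropping the lagged noise terms $e(k-1), \dots, e(k-n_e)$ from $F$ gives the NARX model used for prediction after training.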
You can define custom regressors, but only polynomial ones for now. Check this example for more details. There is no easy way to set a custom regressor using a different basis function (at least I can't figure out how we could do it in the current stage of the package without adding an entirely new method specifically for that). But if you need custom polynomial regressors, you are good. EDIT: maybe there is a way to add custom regressors that aren't polynomials by implementing them as a custom basis function. It's not the best way, since it would require more work in your code, but it could be done.
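Even without library support for arbitrary basis functions, a custom polynomial regressor like the one asked about below can always be built by hand as an extra data column and handed to any linear-in-the-parameters estimator. A minimal sketch (the helper name is hypothetical, not a SysIdentPy API):

```python
# Hypothetical helper: value of the custom regressor y(k-1)*u(k)^2 at sample k
def custom_regressor(y, u, k):
    return y[k - 1] * u[k] ** 2

# Toy data; the resulting column can be stacked next to the standard
# lagged-y and lagged-u regressors in the regression matrix
y = [0.0, 0.2, 0.5, 0.4]
u = [1.0, -0.5, 2.0, 0.3]
column = [custom_regressor(y, u, k) for k in range(1, len(y))]
```

Because the model stays linear in the parameters, adding such a column doesn't change the estimation machinery at all; only the regressor dictionary grows.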
Again, thanks for your message. Let me know if you need anything else.
Hey @jankap ! I'm closing this issue because I think I've covered your questions, but if you have anything else you can comment here or open a discussion in the project so we can keep talking about the subject.
First of all, this toolbox has evolved a lot since the last time I had a look, wow. Currently I'm working with the MATLAB System Identification Toolbox, but I'll experiment with this implementation here, too. However, I have a couple of questions to be clarified :)
`y(k-1)*u(k)^2`
and similar. See also https://www.mathworks.com/help/ident/ref/customregressor.html

Thank you very much and keep up the great work!