Closed TarenGorman closed 7 years ago
You should make the feature transformer a parameter which defaults to the Polynomial one (as described in the original paper).
Okay I will try doing a toggle for the transformer based on an operators param. So if no operators are given it will default to the PolynomialFeatures transformer.
I was thinking more like:
```python
def __init__(self, feature_transformer=None):
    self.feature_transformer = feature_transformer or PolynomialFeatures(degree=3, include_bias=False)
```
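To make the suggestion concrete, here is a minimal, runnable sketch of that dependency-injection pattern; the `SINDy` class body and the `theta_` attribute are illustrative stand-ins, not code from this repository:

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures

class SINDy:
    def __init__(self, feature_transformer=None):
        # Fall back to cubic polynomial features (no bias column),
        # matching the default described in the original paper.
        self.feature_transformer = feature_transformer or PolynomialFeatures(
            degree=3, include_bias=False
        )

    def fit(self, x, y=None):
        # Build the candidate-function library; the sparse regression
        # step of SINDy would follow here and is omitted in this sketch.
        self.theta_ = self.feature_transformer.fit_transform(x)
        return self

x = np.array([[1.0, 2.0]])
model = SINDy().fit(x)
# 2 inputs, degree 3, no bias -> 2 + 3 + 4 = 9 library columns
print(model.theta_.shape)  # (1, 9)
```

Any object with a `fit_transform` method could then be passed in place of the default, which is the freedom being argued for.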
```python
def __init__(self, alpha=1.0, threshold=0.1, degree=3, operators=None):
    self.alpha = alpha
    self.threshold = threshold
    self.degree = degree
    self.operators = operators

def fit(self, x, y=None):
    if self.operators is not None:
        feature_transformer = SymbolicFeatures(
            exponents=np.linspace(1, self.degree, self.degree),
            operators=self.operators,
        )
    else:
        feature_transformer = PolynomialFeatures(degree=self.degree, include_bias=False)
```
I see what you mean with your snippet, but the above keeps the transformer internal, so the parameters for SINDy stay simpler: something like the paper's `usesine` flag, rather than asking the user to construct an object from another library and pass it in.
Though it is nice to have the freedom of supplying any transformer as a parameter. It is up to you which is best; maybe both?
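A hypothetical way to get "both" is to keep the simple `degree`/`operators` parameters while letting an explicitly passed transformer override them. This is only a sketch of that compromise; the class body is invented for illustration, and the `SymbolicFeatures` branch is stubbed out since that class belongs to this project:

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures

class SINDy:
    def __init__(self, degree=3, operators=None, feature_transformer=None):
        self.degree = degree
        self.operators = operators
        self.feature_transformer = feature_transformer

    def _make_transformer(self):
        if self.feature_transformer is not None:
            return self.feature_transformer  # an explicit object wins
        # If operators were given, SymbolicFeatures would be built here
        # (omitted so this sketch runs without the project's own code).
        return PolynomialFeatures(degree=self.degree, include_bias=False)

    def fit(self, x, y=None):
        self.theta_ = self._make_transformer().fit_transform(x)
        return self

x = np.array([[1.0, 2.0]])
default_model = SINDy().fit(x)                    # cubic library, 9 columns
custom = PolynomialFeatures(degree=2, include_bias=False)
custom_model = SINDy(feature_transformer=custom).fit(x)  # 5 columns
```

Casual users would never touch `feature_transformer`, while advanced users keep the full flexibility of passing their own object.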
That works as well. I also wanted to change the SymbolicTransformer so that it mimics PolynomialFeatures more closely when only exponents are given.
Okay, then I will do it this way for now. Yeah, that'd be great; once that is changed, PolynomialFeatures and the if/else can be taken out.
The sklearn implementation will be faster :-)
Proposal of features for the SINDy class: