Ohjeah / sparsereg

A collection of modern sparse (regularized) linear regression algorithms.
MIT License

sindy model features #12

Closed by TarenGorman 6 years ago

TarenGorman commented 6 years ago

Proposal of features for the sindy class:

Ohjeah commented 6 years ago

You should make the feature transformer a parameter which defaults to the Polynomial one (as described in the original paper).

TarenGorman commented 6 years ago

Okay I will try doing a toggle for the transformer based on an operators param. So if no operators are given it will default to the PolynomialFeatures transformer.

Ohjeah commented 6 years ago

I was thinking more like:

def __init__(self, feature_transformer=None):
    self.feature_transformer = feature_transformer or PolynomialFeatures(degree=3, include_bias=False)
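As a minimal sketch of that default-argument pattern (the class name `SINDy` and the surrounding scaffolding are assumptions for illustration, not the library's actual code):

```python
from sklearn.preprocessing import PolynomialFeatures


class SINDy:  # hypothetical class name, for illustration only
    def __init__(self, feature_transformer=None):
        # Fall back to the polynomial transformer from the original
        # paper when the user does not supply one.
        self.feature_transformer = feature_transformer or PolynomialFeatures(
            degree=3, include_bias=False
        )


model = SINDy()
print(type(model.feature_transformer).__name__)
```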
TarenGorman commented 6 years ago
def __init__(self, alpha=1.0, threshold=0.1, degree=3, operators=None):
    self.alpha = alpha
    self.threshold = threshold
    self.degree = degree
    self.operators = operators

def fit(self, x, y=None):
    if self.operators is not None:
        feature_transformer = SymbolicFeatures(exponents=np.linspace(1, self.degree, self.degree), operators=self.operators)
    else:
        feature_transformer = PolynomialFeatures(degree=self.degree, include_bias=False)

So I see what you mean with your snippet, but the above keeps the transformer internal, which makes the parameters for sindy simpler (like the paper's "usesine" parameter) instead of requiring the user to construct an object from another library for the param.

Though it is also nice to have the freedom of passing a transformer as a param. It is up to you which is best; maybe both?
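Combining the two ideas could look something like the sketch below. The `SymbolicFeatures` call mirrors the snippet earlier in this thread; the class name, the `_make_transformer` helper, and the precedence order (explicit transformer first, then `operators`, then the polynomial default) are all assumptions for illustration:

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures


class SINDy:  # hypothetical class name, for illustration only
    def __init__(self, alpha=1.0, threshold=0.1, degree=3,
                 operators=None, feature_transformer=None):
        self.alpha = alpha
        self.threshold = threshold
        self.degree = degree
        self.operators = operators
        self.feature_transformer = feature_transformer

    def _make_transformer(self):
        # An explicitly supplied transformer takes precedence.
        if self.feature_transformer is not None:
            return self.feature_transformer
        if self.operators is not None:
            # SymbolicFeatures is sparsereg's transformer, called as in the
            # snippet above; its import is omitted here because the module
            # path is not shown in this thread. This branch would need that
            # import to run.
            return SymbolicFeatures(
                exponents=np.linspace(1, self.degree, self.degree),
                operators=self.operators,
            )
        # Default: the polynomial library from the original paper.
        return PolynomialFeatures(degree=self.degree, include_bias=False)
```

This keeps the simple `operators=` toggle for common cases while still letting advanced users inject any transformer they like.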

Ohjeah commented 6 years ago

That works as well. I also wanted to change the SymbolicTransformer so that it mimics the Polynomial one more closely when only exponents are given.

TarenGorman commented 6 years ago

Okay, then I will do it this way for now. Yeah, that would be great; once that change lands, PolynomialFeatures and the if/else can be taken out.

Ohjeah commented 6 years ago

The sklearn implementation will be faster :-)
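One way to sanity-check that kind of performance claim: time `PolynomialFeatures` against a naive pure-Python monomial expansion. The `naive_poly` function below is a stand-in written for this comparison (it is not sparsereg's `SymbolicFeatures`), and it happens to produce columns in the same combinations-with-replacement order sklearn uses:

```python
import timeit
from itertools import combinations_with_replacement

import numpy as np
from sklearn.preprocessing import PolynomialFeatures


def naive_poly(x, degree=3):
    # Pure-Python monomial expansion: for each degree d, multiply the
    # columns named by every size-d combination (with replacement).
    cols = []
    for d in range(1, degree + 1):
        for idx in combinations_with_replacement(range(x.shape[1]), d):
            col = np.ones(x.shape[0])
            for i in idx:
                col = col * x[:, i]
            cols.append(col)
    return np.column_stack(cols)


x = np.random.rand(1000, 3)
pf = PolynomialFeatures(degree=3, include_bias=False)

t_sklearn = timeit.timeit(lambda: pf.fit_transform(x), number=20)
t_naive = timeit.timeit(lambda: naive_poly(x), number=20)
print(f"sklearn: {t_sklearn:.4f}s  naive: {t_naive:.4f}s")
```

The exact timings will vary with data size and degree, but both paths produce the same feature matrix, so either can serve as a reference implementation while profiling.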