lacava / few

a feature engineering wrapper for sklearn
https://lacava.github.io/few
GNU General Public License v3.0

Error with installation #39

Closed: GinoWoz1 closed this issue 5 years ago

GinoWoz1 commented 5 years ago

Hello,

Trying to install this package but running into some issues, any ideas? I have VS 14.16 on my PC and I'm getting this error when running 'pip install few'. At first it was asking for eigency, but after installing that, this error popped up.

[screenshot: error output from 'pip install few']

Sincerely, G

lacava commented 5 years ago

Congrats, I think you're the first person I've talked to who has tested the Windows install!

I think the issue is that the flag specifying the C++ standard to compile with is different in VS than in the GNU compiler. Could you try this and let me know if it fixes it:

clone the repo directly rather than installing via pip. from the code directory, change line 79 of setup.py to

extra_compile_args = ['/std:c++latest'])],

then from the code directory run

python setup.py install

to install it. This should work.
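For context, the surrounding Extension entry in setup.py looks roughly like the sketch below; the module and source names are placeholders, only the compiler flag is the point. MSVC wants '/std:c++latest' where GCC/Clang take '-std=c++0x' or '-std=c++11'.

from setuptools import Extension

# sketch only: the extension name and source path are placeholders
ext_modules = [Extension(
    'few_lib',                          # placeholder module name
    sources=['few/lib/few_lib.pyx'],    # placeholder source file
    language='c++',
    # MSVC flag; GCC/Clang would use '-std=c++0x' or '-std=c++11' here
    extra_compile_args=['/std:c++latest'])]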

GinoWoz1 commented 5 years ago

Thanks. Getting more errors now. The unrecognized-flag error is no longer showing up, but now I'm getting eigency errors.

[screenshots: eigency compilation errors]

lacava commented 5 years ago

I've pushed an updated version of Few to pypi (FEW-0.0.51) that should address the OS issue with extra_compile_args, as well as the iota and random_shuffle errors.

I'm not sure what is going on with the eigency errors. There is a newer stable release of Eigen (3.3.5), and eigency has an option to point to a local copy of Eigen. That might work if my changes don't make those errors magically go away.
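I haven't verified it, but pointing the build at a local Eigen checkout would look something like this in setup.py, assuming eigency's get_includes(include_eigen=...) option (the local path is just an example):

import eigency

# skip eigency's bundled Eigen and point at a locally downloaded copy instead
# (assumed option; the path below is an example location for Eigen 3.3.5)
include_dirs = eigency.get_includes(include_eigen=False) + ['C:/local/eigen-3.3.5']

Those include_dirs then get passed to the Extension in setup.py.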

GinoWoz1 commented 5 years ago

Thanks! So is that available now?

lacava commented 5 years ago

yep, you can clone or install via pip

GinoWoz1 commented 5 years ago

Thanks a lot @lacava. You can close this out.

Quick question: how do I ask FEW to minimize? It seems to be trying to maximize the loss score. I am using RMSLE, so ideally I should be minimizing it.

Sincerely, G

lacava commented 5 years ago

there are two different loss functions used by FEW: the one that assigns fitness to the features in the model (controlled by self.fit_choice) and the one that keeps track of the best model so far (controlled by self.scoring_function).

self.scoring_function has to be maximized, but you can pass in a scoring function that negates the RMSLE. I haven't tried this, but something like the following should work:

def minusRMSLE(y, y_pred):
    return -RMSLE(y, y_pred)

est = FEW(scoring_function=minusRMSLE)

you can also pass any of the built-in scoring functions.
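For a self-contained sketch (this assumes the usual from few import FEW import and implements RMSLE with numpy; the log1p form assumes non-negative targets):

import numpy as np
from few import FEW

def RMSLE(y, y_pred):
    # root mean squared log error; assumes non-negative targets
    return np.sqrt(np.mean((np.log1p(y_pred) - np.log1p(y)) ** 2))

def minusRMSLE(y, y_pred):
    # FEW maximizes scoring_function, so return the negated error
    return -RMSLE(y, y_pred)

est = FEW(scoring_function=minusRMSLE)
# then est.fit(X_train, y_train) as with any scikit-learn style estimator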

GinoWoz1 commented 5 years ago

Thanks. I tried it, but my initial CV score is almost the best I got; by 100 generations the score has gotten significantly worse. I will open another issue with questions about the current ML validation score.