neurospin / pylearn-parsimony_history

Sparse and Structured Machine Learning in Python
BSD 3-Clause "New" or "Revised" License

Logistic regression and Dynamic CONESTA #26

Open tomlof opened 10 years ago

tomlof commented 10 years ago

We currently cannot support logistic regression with Dynamic CONESTA.

We have not yet added e.g. the gap and the betahat to the logistic regression function with L1, L2 and Nesterov penalties. Until we do, we cannot use Dynamic CONESTA when minimising penalised logistic regression.

duchesnay commented 10 years ago

I think this is not a major issue, since static CONESTA competes well. Moreover, we may consider dropping Dynamic CONESTA in a future release if we cannot demonstrate a major improvement from using it.

FH235918 commented 10 years ago

For the moment, we can add the Excessive Gap method for this model, until the dynamic version is fixed.

vguillemot commented 10 years ago

The Excessive Gap method is not yet ready to be applied to logistic regression!

tomlof commented 10 years ago

I think we should hold off on scrapping Dynamic CONESTA at least until the paper is published ;-)

Perhaps Fouad can write down the necessary quantities for logistic regression, such as how to compute the duality gap, beta hat, the optimal mu, the optimal eps and so on. If so, I can add them properly. Fouad, maybe we can spend a day on this the next time I visit?
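To make the missing pieces concrete, here is a minimal, purely illustrative Python sketch of the hooks such a function class would need before Dynamic CONESTA could drive it. This is a hypothetical skeleton, not the actual pylearn-parsimony API: the class and method names (`SmoothedLogisticFunction`, `gap`, `betahat`, `mu_opt`, `eps_opt`) are assumptions chosen to mirror the quantities listed above, and the loss value is a placeholder.

```python
import numpy as np


class SmoothedLogisticFunction:
    """Hypothetical Nesterov-smoothed penalised logistic loss, with the
    extra hooks Dynamic CONESTA needs (illustrative skeleton only)."""

    def __init__(self, mu=1.0):
        self.mu = mu  # current smoothing parameter

    def f(self, beta):
        # Placeholder objective value, so the skeleton is runnable; the
        # real class would evaluate the smoothed penalised logistic loss.
        return 0.5 * float(beta.T @ beta)

    def gap(self, beta):
        # Duality gap at beta; Dynamic CONESTA uses it to decide when to
        # shrink mu and eps. Not yet derived for logistic regression.
        raise NotImplementedError("gap not derived for logistic regression")

    def betahat(self, beta):
        # Auxiliary (dual-optimal) point used in the gap computation.
        raise NotImplementedError("betahat not derived for logistic regression")

    def mu_opt(self, eps):
        # Smoothing parameter that is optimal for a target precision eps.
        raise NotImplementedError

    def eps_opt(self, mu):
        # Precision achievable with a given smoothing parameter mu.
        raise NotImplementedError
```

Filling in `gap` and `betahat` (and the `mu_opt`/`eps_opt` pair) for the logistic loss is exactly the derivation work discussed above.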