flennerhag / mlens

ML-Ensemble – high performance ensemble learning
http://ml-ensemble.com
MIT License

Model selection MLPRegressor hidden_layer_sizes #125

Closed trierweiler closed 4 years ago

trierweiler commented 4 years ago

Hi, I am trying to tune an MLPRegressor() meta learner, but it returns this warning:

"C:\Users...\mlens\model_selection\model_selection.py:585: UserWarning: No valid parameters found for 1st_layer.MLP_log. Will fit and score once with given parameter settings. "settings.".format(key))

I think this is because I included the hidden_layer_sizes (tuple) and activation (list of strings) parameters in params_dict. Apparently it is only possible to tune numerical parameters, isn't it?
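For reference, a params_dict along these lines reproduces the warning (an illustrative sketch, not my exact code; the estimator key matches the one in the warning):

    # Illustrative: raw candidate values instead of distributions with an rvs method
    params_dict = {
        '1st_layer.MLP_log': {
            'hidden_layer_sizes': ((50,), (100,), (50, 50)),  # tuple of tuples
            'activation': ['relu', 'tanh', 'logistic'],       # list of strings
        }
    }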

flennerhag commented 4 years ago

Hey, sorry for the late follow-up!

It should be possible to use strings and tuples. You're probably seeing this error because you're breaking the API: the param dict needs to specify a distribution for each key, and that distribution should be an instance with an rvs method.
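For purely numerical parameters, the scipy.stats distributions already satisfy this. A minimal sketch (the '1st_layer.MLP_log' key is copied from your warning; the parameter names and ranges are just examples):

    from scipy import stats

    # Each value is a distribution object exposing rvs(size, random_state)
    param_dicts = {
        '1st_layer.MLP_log': {
            'alpha': stats.uniform(1e-5, 1e-2),    # continuous L2 penalty
            'max_iter': stats.randint(200, 1000),  # integer iteration budget
        }
    }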

Here's how we draw parameters:

    def _draw_params(self, param_dists):
        """Draw a list of param dictionaries for estimator."""
        # Set up empty list of parameter setting
        param_draws = [{} for _ in range(self.n_iter)]

        # Fill list of parameter settings by param
        for param, dist in param_dists.items():
            draws = dist.rvs(size=self.n_iter, random_state=self.random_state)

            for i, draw in enumerate(draws):
                param_draws[i][param] = draw

        return param_draws

If calling this method fails, you get the above warning, which is unfortunately a bit confusing and something we should fix.

mlens assumes the distribution follows the scipy API. You might be able to find a scipy distribution that fits your needs, but you can also easily create one yourself:

    import random

    class MyDist:
        """Discrete distribution over an arbitrary support, mimicking scipy's rvs API."""

        def __init__(self, support):
            self.support = support

        def rvs(self, size=1, random_state=None):
            # mlens passes random_state (assumed to be None or an int seed here)
            rng = random.Random(random_state)
            return [rng.choice(self.support) for _ in range(size)]
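
With a helper like that, the non-numerical MLPRegressor parameters from your question can be searched as well, for example:

    # Illustrative: discrete choices for the tuple- and string-valued parameters
    param_dicts = {
        '1st_layer.MLP_log': {
            'hidden_layer_sizes': MyDist([(50,), (100,), (50, 50)]),
            'activation': MyDist(['relu', 'tanh', 'logistic']),
        }
    }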