kiudee closed this issue 4 years ago.
One possible simple wrapper could look like this:
```python
import inspect


def wrap_class(cl):
    class Wrapper(cl):
        def __init__(self, **parameters):
            # Treat every named __init__ parameter of the wrapped class as a
            # hyperparameter; skip *args/**kwargs catch-alls, which have no
            # attribute of the same name.
            sig = inspect.signature(super().__init__)
            self.sk_params = [
                name for name, param in sig.parameters.items()
                if param.kind not in (param.VAR_POSITIONAL, param.VAR_KEYWORD)
            ]
            super().__init__(**parameters)

        def get_params(self, **kwargs):
            d = {}
            for p in self.sk_params:
                d[p] = getattr(self, p)
            return d

        def set_params(self, **parameters):
            for parameter, value in parameters.items():
                if parameter in self.sk_params:
                    setattr(self, parameter, value)
            return self

    return Wrapper
```
This works for all non-compatible classes which obey the "hyperparameters are all given as `__init__` parameters" convention.
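For illustration, wrapping a small hypothetical class that follows this convention (and stores its parameters as attributes, as sklearn expects) could look like this:

```python
# Hypothetical example class; any class whose __init__ parameters are also
# stored as attributes of the same name works the same way.
class PlainOptimizer:
    def __init__(self, learning_rate=0.01, momentum=0.0):
        self.learning_rate = learning_rate
        self.momentum = momentum


WrappedOptimizer = wrap_class(PlainOptimizer)
opt = WrappedOptimizer(learning_rate=0.1)
print(opt.get_params())        # {'learning_rate': 0.1, 'momentum': 0.0}
opt.set_params(momentum=0.9)
```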
OK, that idea also appears to not work 100% correctly: `BaseEstimator.get_params` recursively calls `get_params` on its parameters (if available). We pass the optimizers in their uninitialized form as a class and only initialize them later, so `get_params` will throw an error there.
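A minimal sketch of the failure mode (reusing `wrap_class` and the hypothetical `PlainOptimizer` from above; whether `get_params` recurses into a bare class depends on the scikit-learn version):

```python
from sklearn.base import BaseEstimator


# Hypothetical minimal estimator that stores the *uninitialized* optimizer class.
class Learner(BaseEstimator):
    def __init__(self, optimizer=None):
        self.optimizer = optimizer


learner = Learner(optimizer=wrap_class(PlainOptimizer))  # a class, not an instance

# BaseEstimator.get_params(deep=True) sees that `optimizer` has a `get_params`
# attribute and calls it. On the class it is an unbound function, so the call
# fails along the lines of:
#   TypeError: get_params() missing 1 required positional argument: 'self'
learner.get_params(deep=True)
```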
I will think about possible solutions. @timokau ideas from your side?
Interesting problem, I did not consider the case where you would want to set nested parameters. To me it seems easiest to implement `set_params` and `get_params` in our `Learner` class. It shouldn't be too complicated (famous last words); it should just set all non-nested parameters as attributes and call the `initialize_optimizer` (etc.) function again. What do you think?
> Interesting problem, I did not consider the case where you would want to set nested parameters.

Indeed, in the keras wrapper they go even a step further and also consider the parameters passed to `fit` and `predict`.

> To me it seems easiest to implement `set_params` and `get_params` in our `Learner` class. It shouldn't be too complicated (famous last words); it should just set all non-nested parameters as attributes and call the `initialize_optimizer` (etc.) function again. What do you think?
I agree that this should be the easiest solution for now. Most likely there are edge cases which are hard to predict right now.
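A rough sketch of that approach (all names below are hypothetical; the actual `Learner` internals are not shown in this thread):

```python
class Learner:
    """Hypothetical learner; only the hyperparameter plumbing is sketched."""

    def __init__(self, optimizer, learning_rate=0.01):
        self.optimizer = optimizer            # uninitialized optimizer class
        self.learning_rate = learning_rate
        self.initialize_optimizer()

    def initialize_optimizer(self):
        # Re-instantiate the optimizer class with the current hyperparameters.
        self.optimizer_ = self.optimizer(learning_rate=self.learning_rate)

    def get_params(self, deep=True):
        # Expose only the non-nested hyperparameters.
        return {"optimizer": self.optimizer, "learning_rate": self.learning_rate}

    def set_params(self, **params):
        for name, value in params.items():
            setattr(self, name, value)
        self.initialize_optimizer()           # make the new values take effect
        return self
```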
One additional consideration is that the hyperparameters of the keras optimizers are not necessarily well-behaved. The official interface uses `self._set_hyper()` to declare a hyperparameter, but these are typically only a subset of the parameters passed into `__init__`.

Examples:

- Some optimizers take `initial_accumulator_value`, `l2_regularization_strength` etc., but `initial_accumulator_value` is not declared as a hyperparameter.
- `clipnorm` and `clipvalue` are specified in the parent class.

I think the easiest way to interface with these optimizers would be to use the official `get_config` interface, where all parameters are listed (parameters like `name` can be filtered out).
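For example (a sketch against the public Keras API; the choice of which entries to filter out is an assumption):

```python
import tensorflow as tf


def optimizer_params(optimizer, exclude=("name",)):
    """Derive an sklearn-style parameter dict from a Keras optimizer."""
    # get_config() lists every constructor argument, including the ones that
    # are not declared via _set_hyper(); drop the non-tunable entries.
    return {k: v for k, v in optimizer.get_config().items() if k not in exclude}


sgd = tf.keras.optimizers.SGD(learning_rate=0.01, momentum=0.9)
print(optimizer_params(sgd))
# e.g. {'learning_rate': 0.01, 'decay': 0.0, 'momentum': 0.9, 'nesterov': False}

# Setting parameters could go through from_config in the same spirit:
new_sgd = tf.keras.optimizers.SGD.from_config({**sgd.get_config(), "momentum": 0.5})
```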
Fixed in #170.
A call like
fails with the error message:
This issue arises because `SGD` (or other optimizers) is not compatible with sklearn's hyperparameter API. Keras itself implements an sklearn wrapper in which different parameter categories are distinguished (depending on where these parameters are passed). A change like that sounds like a lot of work though.

One workaround would be to implement a wrapper for the optimizers. That way they would natively support `get_params`/`set_params`. Any other ideas for solutions?
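For reference, this is roughly how the legacy Keras scikit-learn wrapper mentioned above distributes parameters (the model below is just a placeholder):

```python
from keras.models import Sequential
from keras.layers import Dense
from keras.wrappers.scikit_learn import KerasClassifier


def build_model(units=32):
    model = Sequential([
        Dense(units, activation="relu", input_shape=(4,)),
        Dense(3, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
    return model


# `units` is matched against build_model's signature, while `epochs` and
# `batch_size` are matched against Sequential.fit; all of them are exposed
# through get_params/set_params.
clf = KerasClassifier(build_fn=build_model, units=64, epochs=10, batch_size=32)
print(clf.get_params())
```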