Main goal would be to define the `Objective` analogous to kopt's `CompileFN`, but now using gin-config. The arguments of `Objective` would be the same as the arguments to `gin_train`, but the gin-config files would be normal gin files which are overridden using gin bindings (https://github.com/Avsecz/gin-train/blob/master/gin_train/cli/gin_train.py#L187). E.g. either pass them to `parse_config_files_and_bindings` as `bindings`, or specify them via `gin.bind_parameter`:
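A minimal sketch of the `bind_parameter` route, assuming gin's public API (the parameter names below are hypothetical):

```python
import gin

# Hypothetical gin-configurable parameter names; any flat key works.
gin.bind_parameter("train.lr", 0.001)
gin.bind_parameter("train.batch_size", 32)

# Lock the config only once all parameters have been bound.
gin.finalize()
```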
where values can be any valid Python object (lists, tuples, dicts, strings). Note that if we use `bind_parameter`, then `finalize()` should only be called after we have bound all the parameters.
```python
import json

def config2bindings(config):
    """Convert a flat config dict into gin binding strings."""
    return [f"{k} = {json.dumps(v)}" for k, v in config.items()]

config = {"asd": [1, 2, 3],
          "dsa": 10,
          "dsads": "11",
          "dsasdsadas": {"a": 1}}

bindings = config2bindings(config)
for p in bindings:
    print(p)
```

which prints:

```
asd = [1, 2, 3]
dsa = 10
dsads = "11"
dsasdsadas = {"a": 1}
```
Both approaches assume that the dictionary is solely a flat key-value mapping; it may contain dictionaries / lists as values, but these will not be interpreted as nested variables.
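For example, a nested dict is serialized as a single value rather than expanded into per-key bindings:

```python
print(config2bindings({"model": {"lr": 0.1}}))
# -> ['model = {"lr": 0.1}'], not 'model.lr = 0.1'
```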
Note: `note_params` should be used to keep track of the hyper-parameter optimization study and the run-id.
Additional arguments to `Objective`:

```python
objective_metric="acc",      # which metric to optimize for; can be nested if there are multiple
objective_metric_mode="max"  # whether to maximize or minimize the metric
```
Multiple different `Objective` versions would need to be implemented, one for each hyper-parameter optimization system.

Supported backends:

- ray tune - `RayObjective(...)`
  - which also supports HyperOpt
- (maybe) hyperopt - `HyperoptObjective`

For more advanced scenarios we would probably need to implement the `Trainable` class ourselves.
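For the simple function-based case, a rough sketch of how `RayObjective` could plug into Ray Tune (the Tune API has changed across versions; this assumes the classic `tune.run` / `tune.report` interface, and the search space and metric name are illustrative):

```python
from ray import tune

# Assume `objective = Objective(...)` was constructed as in the sketch above.
def ray_objective(config):
    # Run one gin-train trial and report its metric back to Tune.
    tune.report(acc=objective(config))

analysis = tune.run(
    ray_objective,
    config={"train.lr": tune.loguniform(1e-4, 1e-1)},  # illustrative search space
    metric="acc",
    mode="max",
)
```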