autonomio / talos

Hyperparameter Experiments with TensorFlow and Keras
https://autonom.io
MIT License

loss_entropy vs acc_entropy referenced before assignment #331

Closed: prestonbrown-me closed this issue 5 years ago

prestonbrown-me commented 5 years ago

For models that are not compiled with an accuracy metric, what should I do when I get the following error?

local variable 'acc_entropy' referenced before assignment

Trace:

---------------------------------------------------------------------------
UnboundLocalError                         Traceback (most recent call last)
<ipython-input-34-49668748edfd> in <module>()
      1 dummy_x = np.empty((1, num_x_signals))
      2 dummy_y = np.empty((1, num_y_signals))
----> 3 t = ta.Scan(dummy_x, dummy_y, p, generate_model, reduction_metric='val_loss')

~/anaconda3/envs/tensorflow_p36/lib/python3.6/site-packages/talos/scan/Scan.py in __init__(self, x, y, params, model, dataset_name, experiment_no, experiment_name, x_val, y_val, val_split, shuffle, round_limit, time_limit, grid_downsample, random_method, seed, search_method, permutation_filter, reduction_method, reduction_interval, reduction_window, reduction_threshold, reduction_metric, reduce_loss, last_epoch_value, clear_tf_session, disable_progress_bar, print_params, debug)
    183         # input parameters section ends
    184 
--> 185         self._null = self.runtime()
    186 
    187     def runtime(self):

~/anaconda3/envs/tensorflow_p36/lib/python3.6/site-packages/talos/scan/Scan.py in runtime(self)
    188 
    189         self = scan_prepare(self)
--> 190         self = scan_run(self)

~/anaconda3/envs/tensorflow_p36/lib/python3.6/site-packages/talos/scan/scan_run.py in scan_run(self)
     18     # start the main loop of the program
     19     while len(self.param_log) != 0:
---> 20         self = scan_round(self)
     21         self.pbar.update(1)
     22         if self.time_limit is not None:

~/anaconda3/envs/tensorflow_p36/lib/python3.6/site-packages/talos/scan/scan_round.py in scan_round(self)
     47     # create log and other stats
     48     try:
---> 49         self.epoch_entropy.append(epoch_entropy(_hr_out))
     50     except (TypeError, AttributeError):
     51         raise TalosReturnError("Make sure that input model returns in the order 'out, model'")

~/anaconda3/envs/tensorflow_p36/lib/python3.6/site-packages/talos/metrics/entropy.py in epoch_entropy(history)
     52             loss_entropy = nan
     53 
---> 54     return [acc_entropy, loss_entropy]

UnboundLocalError: local variable 'acc_entropy' referenced before assignment
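
For context, the failing code path in talos/metrics/entropy.py appears to bind acc_entropy only when the training history contains an accuracy metric, while returning it unconditionally. A minimal sketch of that shape, as an illustration of the Python error only, not the actual talos source:

from numpy import nan

def epoch_entropy_sketch(history):
    # acc_entropy is only bound when an accuracy metric is present
    if 'acc' in history.history:
        acc_entropy = 0.0   # stand-in for the real entropy calculation
    if 'loss' in history.history:
        loss_entropy = 0.0  # stand-in for the real entropy calculation
    else:
        loss_entropy = nan

    # with no accuracy metric in the history, this line raises
    # UnboundLocalError: local variable 'acc_entropy' referenced before assignment
    return [acc_entropy, loss_entropy]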

Model:

p = {
    'first_neuron': [25, 256, 1024],
    'batch_size': [10, 20, 30],
    'dropout': [0.1, 0.15, 0.2, 0.25],
    'lr': [1e-3, 1e-4, 1e-5, 1e-6]
}

# imports used by the model function
from keras.models import Sequential
from keras.layers import CuDNNGRU, Dropout, Dense
from keras.optimizers import RMSprop

def generate_model(x_train, y_train, x_val, y_val, params):
    # replace the hyperparameter inputs with references to the params dictionary
    model = Sequential()
    optimizer = RMSprop(lr=params['lr'])
    model.add(CuDNNGRU(units=params['first_neuron'],
                       return_sequences=True,
                       input_shape=(None, num_x_signals)))
    model.add(Dropout(params['dropout']))
    model.add(Dense(num_y_signals, activation='sigmoid'))
    model.compile(loss=loss_mse_warmup, optimizer=optimizer)

    # make sure the history object is returned by model.fit_generator();
    # generator, validation_data, callbacks, loss_mse_warmup, num_x_signals
    # and num_y_signals are defined elsewhere in the notebook
    out = model.fit_generator(generator=generator,
                              epochs=30,
                              steps_per_epoch=50,
                              validation_data=validation_data,
                              callbacks=callbacks)

    # return the history object and the model, in that order
    return out, model

Scan call:

t = ta.Scan(dummy_x, dummy_y, p, generate_model, reduction_metric='val_loss')
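
Since the error only occurs when the fit history has no accuracy key, one possible workaround on affected versions (my assumption, not something confirmed in this thread) would be to compile with an accuracy metric so that the history contains an 'acc' entry:

model.compile(loss=loss_mse_warmup,
              optimizer=optimizer,
              metrics=['acc'])  # adds an 'acc' key to the fit history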

prestonbrown-me commented 5 years ago

Using the DEV version, I do not get this issue. I'm wondering what the cause of this might be.

mikkokotila commented 5 years ago

The issue has been fixed, so you should not see this error from a certain version onwards; I think v0.5. Closing here. Feel free to open a new issue if anything comes up.
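
For anyone landing here later, checking the installed version before upgrading is a quick sanity check (assuming talos exposes __version__ like most packages):

import talos

# the fix is reported to be in place from around v0.5 onwards
print(talos.__version__)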

Happy mid-summer too! :)