keras-team / keras-tuner

A Hyperparameter Tuning Library for Keras
https://keras.io/keras_tuner/
Apache License 2.0
2.85k stars 395 forks

tqdm - callback is not deep-copyable #392

Open jtlz2 opened 4 years ago

jtlz2 commented 4 years ago

I am trying to use a tqdm callback, which works for me with model.fit(), inside keras-tuner. This triggers the following error:


from tqdm.keras import TqdmCallback

# [...]

callbacks_list = []
tqdm_cb = TqdmCallback(verbose=0)
callbacks_list.append(tqdm_cb)

num_epochs = 5000
batch_size = 64
tuner.search(train_data, train_labels, epochs=num_epochs, batch_size=batch_size,
             validation_data=(validation_data, validation_labels), verbose=2,
             callbacks=callbacks_list)
ValueError: All callbacks used during a search should be deep-copyable (since they are reused across trials). It is not possible to do `copy.deepcopy([<tqdm.keras.TqdmCallback object at 0x7f553eb3f890>])`

keras-tqdm doesn't work for me either.

I am not sure on whose side this falls. Is there an easy way to make tqdm (or any other callback) deep-copyable?

Is there another progressbar alternative?

Any idea how to fix this / or is there any other workaround?

Thanks of course for keras-tuner - amazingly valuable!

yixingfu commented 4 years ago

Keras-Tuner requires callbacks to be deep-copyable because it needs a fresh copy of the same callback for each trial (each time the tuner calls model.fit). I don't think there is a simple way to get around this.
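A callback usually fails `copy.deepcopy` because it holds a non-copyable resource (a thread lock, an open file handle, a live progress bar). One possible workaround, sketched below with a hypothetical stand-in class rather than the real `TqdmCallback`, is to define `__deepcopy__` so that each trial simply gets a fresh instance instead of a copy of internal state:

```python
import copy
import threading

class ProgressCallback:
    """Hypothetical stand-in for a progress-bar callback that holds a lock."""

    def __init__(self, verbose=0):
        self.verbose = verbose
        self._lock = threading.Lock()  # thread locks cannot be deep-copied

    def __deepcopy__(self, memo):
        # Return a brand-new instance instead of copying internal state,
        # so each tuner trial starts with a clean progress bar.
        return type(self)(verbose=self.verbose)

fresh = copy.deepcopy(ProgressCallback(verbose=2))
print(fresh.verbose)
```

Without the `__deepcopy__` method, the `deepcopy` call would fail on the lock; with it, the tuner's per-trial copy just constructs a new callback. The same idea could be applied by subclassing the real tqdm callback, though whether its constructor arguments are fully recoverable this way is an assumption.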

To better understand what you need: are you trying to show a tqdm progress bar for the overall search, or for the fitting within each trial?

dziatkowski commented 3 years ago

I had the same error when passing a list of callbacks to Keras-Tuner. In my case it helped to switch from something like this

callbackInstance1 = callbackClass1(param=param)
callbackInstance2 = callbackClass2(param=param)
callbacks = [callbackInstance1, callbackInstance2]
tuner.search(
    ...
    callbacks=callbacks
   ...)

to something like this

callbacks = [callbackClass1(param=param), callbackClass2(param=param)]
tuner.search(
    ...
    callbacks=callbacks
   ...)

That is, I instantiate the callbacks inside the list literal. This worked for a list containing both tf.keras.callbacks.EarlyStopping and custom callbacks.

ylchang commented 2 years ago

Has this issue been solved? I used my own computer to run the Hyperband tuner on two models: a single-layer LSTM and a two-layer LSTM. There was no issue while tuning the single-layer LSTM, but I got the following error while tuning the two-layer one: ValueError: All callbacks used during a search should be deep-copyable (since they are reused across trials). It is not possible to do copy.deepcopy([<keras.callbacks.EarlyStopping object at 0x000002151B2F6470>])

The funny thing is that when I ran the same code on Google Colab, neither model returned an error message; both could be tuned by the Hyperband tuner. Could this hint that the issue is related to the keras-tuner version? (The one on my computer is 1.1.2; I am not sure which version Google Colab uses.)

haifeng-jin commented 2 years ago

@ylchang You can always use keras_tuner.__version__ to check the version info.