Closed. SMartinson24 closed this issue 1 week ago.
Thanks for posting, Sharon.
I downloaded the data from your example and ran the training with default settings (on Windows with GUI 1.3.0), and everything seems to work fine.
The error log above doesn't look pretty, but it actually doesn't contain any errors, "just" warnings.
I have two quick questions:
Thanks, Stefan
No activity on this issue, so I'm closing it. Please get back to us if the problem still exists.
Describe the bug: When trying to train (either with my own data or the standard Danum training example set), training stops and throws an error. We use all the defaults. Using the BirdNET version downloaded and installed today (GUI 1.2.0; Model 2.4). Tried on both Mac and PC.
To Reproduce: Downloaded the latest BirdNET and installed it according to the setup instructions. Downloaded the Danum data set from https://zenodo.org/records/10790619. Started training (leaving all defaults for the settings). It loads the training data folders, but stops with a nebulous red Error button.
Expected behavior: Train BirdNET using the Yang Danum dataset.
Screenshots
Desktop (please complete the following information): Using the BirdNET version downloaded and installed today (GUI 1.2.0; Model 2.4). Tried on both Mac and PC (Win11).
Additional context: Log from the session (the AutoGraph warnings it contains are discussed in the note after the log): WARNING:tensorflow:From keras\src\losses.py:2976: The name tf.losses.sparse_softmax_cross_entropy is deprecated. Please use tf.compat.v1.losses.sparse_softmax_cross_entropy instead.
...Done. Training model... Training on 32 samples, validating on 8 samples. Epoch 1/50 WARNING:tensorflow:AutoGraph could not transform <function Model.make_train_function.<locals>.train_function at 0x0000026180632320> and will run it as-is.
Cause: Unable to locate the source code of <function Model.make_train_function.<locals>.train_function at 0x0000026180632320>. Note that functions defined in certain environments, like the interactive Python shell, do not expose their source code. If that is the case, you should define them in a .py source file. If you are certain the code is graph-compatible, wrap the call using @tf.autograph.experimental.do_not_convert. Original error: could not get source code
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
WARNING:tensorflow:AutoGraph could not transform <function custom_loss at 0x00000261E3E28820> and will run it as-is.
Cause: Unable to locate the source code of <function custom_loss at 0x00000261E3E28820>. Note that functions defined in certain environments, like the interactive Python shell, do not expose their source code. If that is the case, you should define them in a .py source file. If you are certain the code is graph-compatible, wrap the call using @tf.autograph.experimental.do_not_convert. Original error: could not get source code
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
WARNING:tensorflow:From keras\src\utils\tf_utils.py:492: The name tf.ragged.RaggedTensorValue is deprecated. Please use tf.compat.v1.ragged.RaggedTensorValue instead.
1/1 [==============================] - ETA: 0s - loss: 2.2361 - AUPRC: 0.2672 - AUROC: 0.5396WARNING:tensorflow:AutoGraph could not transform <function Model.make_test_function.<locals>.test_function at 0x0000026181F35F30> and will run it as-is.
Cause: Unable to locate the source code of <function Model.make_test_function.<locals>.test_function at 0x0000026181F35F30>. Note that functions defined in certain environments, like the interactive Python shell, do not expose their source code. If that is the case, you should define them in a .py source file. If you are certain the code is graph-compatible, wrap the call using @tf.autograph.experimental.do_not_convert. Original error: could not get source code
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
1/1 [==============================] - 1s 978ms/step - loss: 2.2361 - AUPRC: 0.2672 - AUROC: 0.5396 - val_loss: 2.0265 - val_AUPRC: 0.3909 - val_AUROC: 0.5231 Epoch 2/50
1/1 [==============================] - ETA: 0s - loss: 1.9597 - AUPRC: 0.2751 - AUROC: 0.5677 1/1 [==============================] - 0s 65ms/step - loss: 1.9597 - AUPRC: 0.2751 - AUROC: 0.5677 - val_loss: 1.8322 - val_AUPRC: 0.4122 - val_AUROC: 0.5370 Epoch 3/50
1/1 [==============================] - ETA: 0s - loss: 1.7536 - AUPRC: 0.3057 - AUROC: 0.6308 1/1 [==============================] - 0s 57ms/step - loss: 1.7536 - AUPRC: 0.3057 - AUROC: 0.6308 - val_loss: 1.6932 - val_AUPRC: 0.4187 - val_AUROC: 0.5694 Epoch 4/50
1/1 [==============================] - ETA: 0s - loss: 1.6036 - AUPRC: 0.3640 - AUROC: 0.7164 1/1 [==============================] - 0s 63ms/step - loss: 1.6036 - AUPRC: 0.3640 - AUROC: 0.7164 - val_loss: 1.5925 - val_AUPRC: 0.4318 - val_AUROC: 0.6111 Epoch 5/50
1/1 [==============================] - ETA: 0s - loss: 1.4930 - AUPRC: 0.4442 - AUROC: 0.7980 1/1 [==============================] - 0s 64ms/step - loss: 1.4930 - AUPRC: 0.4442 - AUROC: 0.7980 - val_loss: 1.5147 - val_AUPRC: 0.4892 - val_AUROC: 0.7037 Epoch 6/50
1/1 [==============================] - ETA: 0s - loss: 1.4069 - AUPRC: 0.5395 - AUROC: 0.8568 1/1 [==============================] - 0s 72ms/step - loss: 1.4069 - AUPRC: 0.5395 - AUROC: 0.8568 - val_loss: 1.4486 - val_AUPRC: 0.5536 - val_AUROC: 0.7963 Epoch 7/50
1/1 [==============================] - ETA: 0s - loss: 1.3342 - AUPRC: 0.6925 - AUROC: 0.8964 1/1 [==============================] - 0s 55ms/step - loss: 1.3342 - AUPRC: 0.6925 - AUROC: 0.8964 - val_loss: 1.3872 - val_AUPRC: 0.6235 - val_AUROC: 0.8611 Epoch 8/50
1/1 [==============================] - ETA: 0s - loss: 1.2680 - AUPRC: 0.7992 - AUROC: 0.9256 1/1 [==============================] - 0s 57ms/step - loss: 1.2680 - AUPRC: 0.7992 - AUROC: 0.9256 - val_loss: 1.3270 - val_AUPRC: 0.7156 - val_AUROC: 0.8935 Epoch 9/50
1/1 [==============================] - ETA: 0s - loss: 1.2045 - AUPRC: 0.8624 - AUROC: 0.9465 1/1 [==============================] - 0s 58ms/step - loss: 1.2045 - AUPRC: 0.8624 - AUROC: 0.9465 - val_loss: 1.2669 - val_AUPRC: 0.8434 - val_AUROC: 0.9259 Epoch 10/50
1/1 [==============================] - ETA: 0s - loss: 1.1421 - AUPRC: 0.8984 - AUROC: 0.9627 1/1 [==============================] - 0s 60ms/step - loss: 1.1421 - AUPRC: 0.8984 - AUROC: 0.9627 - val_loss: 1.2071 - val_AUPRC: 0.8946 - val_AUROC: 0.9491 Epoch 11/50
1/1 [==============================] - ETA: 0s - loss: 1.0807 - AUPRC: 0.9236 - AUROC: 0.9716 1/1 [==============================] - 0s 58ms/step - loss: 1.0807 - AUPRC: 0.9236 - AUROC: 0.9716 - val_loss: 1.1484 - val_AUPRC: 0.9093 - val_AUROC: 0.9537 Epoch 12/50
1/1 [==============================] - ETA: 0s - loss: 1.0205 - AUPRC: 0.9507 - AUROC: 0.9829 1/1 [==============================] - 0s 62ms/step - loss: 1.0205 - AUPRC: 0.9507 - AUROC: 0.9829 - val_loss: 1.0917 - val_AUPRC: 0.9206 - val_AUROC: 0.9537 Epoch 13/50
1/1 [==============================] - ETA: 0s - loss: 0.9623 - AUPRC: 0.9804 - AUROC: 0.9931 1/1 [==============================] - 0s 58ms/step - loss: 0.9623 - AUPRC: 0.9804 - AUROC: 0.9931 - val_loss: 1.0380 - val_AUPRC: 0.9434 - val_AUROC: 0.9722 Epoch 14/50
1/1 [==============================] - ETA: 0s - loss: 0.9069 - AUPRC: 0.9911 - AUROC: 0.9968 1/1 [==============================] - 0s 58ms/step - loss: 0.9069 - AUPRC: 0.9911 - AUROC: 0.9968 - val_loss: 0.9879 - val_AUPRC: 0.9743 - val_AUROC: 0.9907 Epoch 15/50
1/1 [==============================] - ETA: 0s - loss: 0.8549 - AUPRC: 0.9953 - AUROC: 0.9983 1/1 [==============================] - 0s 59ms/step - loss: 0.8549 - AUPRC: 0.9953 - AUROC: 0.9983 - val_loss: 0.9420 - val_AUPRC: 0.9743 - val_AUROC: 0.9907 Epoch 16/50
1/1 [==============================] - ETA: 0s - loss: 0.8069 - AUPRC: 1.0000 - AUROC: 1.0000 1/1 [==============================] - 0s 63ms/step - loss: 0.8069 - AUPRC: 1.0000 - AUROC: 1.0000 - val_loss: 0.9004 - val_AUPRC: 0.9868 - val_AUROC: 0.9954 Epoch 17/50
1/1 [==============================] - ETA: 0s - loss: 0.7632 - AUPRC: 1.0000 - AUROC: 1.0000 1/1 [==============================] - 0s 61ms/step - loss: 0.7632 - AUPRC: 1.0000 - AUROC: 1.0000 - val_loss: 0.8632 - val_AUPRC: 1.0000 - val_AUROC: 1.0000 Epoch 18/50
1/1 [==============================] - ETA: 0s - loss: 0.7240 - AUPRC: 1.0000 - AUROC: 1.0000 1/1 [==============================] - 0s 69ms/step - loss: 0.7240 - AUPRC: 1.0000 - AUROC: 1.0000 - val_loss: 0.8302 - val_AUPRC: 1.0000 - val_AUROC: 1.0000 Epoch 19/50
1/1 [==============================] - ETA: 0s - loss: 0.6891 - AUPRC: 1.0000 - AUROC: 1.0000 1/1 [==============================] - 0s 57ms/step - loss: 0.6891 - AUPRC: 1.0000 - AUROC: 1.0000 - val_loss: 0.8010 - val_AUPRC: 1.0000 - val_AUROC: 1.0000 Epoch 20/50
1/1 [==============================] - ETA: 0s - loss: 0.6583 - AUPRC: 1.0000 - AUROC: 1.0000 1/1 [==============================] - 0s 59ms/step - loss: 0.6583 - AUPRC: 1.0000 - AUROC: 1.0000 - val_loss: 0.7752 - val_AUPRC: 1.0000 - val_AUROC: 1.0000 Epoch 21/50
1/1 [==============================] - ETA: 0s - loss: 0.6311 - AUPRC: 1.0000 - AUROC: 1.0000 1/1 [==============================] - 0s 62ms/step - loss: 0.6311 - AUPRC: 1.0000 - AUROC: 1.0000 - val_loss: 0.7522 - val_AUPRC: 1.0000 - val_AUROC: 1.0000 Epoch 22/50
1/1 [==============================] - ETA: 0s - loss: 0.6071 - AUPRC: 1.0000 - AUROC: 1.0000 1/1 [==============================] - 0s 59ms/step - loss: 0.6071 - AUPRC: 1.0000 - AUROC: 1.0000 - val_loss: 0.7317 - val_AUPRC: 1.0000 - val_AUROC: 1.0000 Epoch 23/50
1/1 [==============================] - ETA: 0s - loss: 0.5858 - AUPRC: 1.0000 - AUROC: 1.0000 1/1 [==============================] - 0s 66ms/step - loss: 0.5858 - AUPRC: 1.0000 - AUROC: 1.0000 - val_loss: 0.7133 - val_AUPRC: 1.0000 - val_AUROC: 1.0000 Epoch 24/50
1/1 [==============================] - ETA: 0s - loss: 0.5668 - AUPRC: 1.0000 - AUROC: 1.0000 1/1 [==============================] - 0s 59ms/step - loss: 0.5668 - AUPRC: 1.0000 - AUROC: 1.0000 - val_loss: 0.6967 - val_AUPRC: 1.0000 - val_AUROC: 1.0000 Epoch 25/50
1/1 [==============================] - ETA: 0s - loss: 0.5497 - AUPRC: 1.0000 - AUROC: 1.0000 1/1 [==============================] - 0s 66ms/step - loss: 0.5497 - AUPRC: 1.0000 - AUROC: 1.0000 - val_loss: 0.6816 - val_AUPRC: 1.0000 - val_AUROC: 1.0000 Epoch 26/50
1/1 [==============================] - ETA: 0s - loss: 0.5343 - AUPRC: 1.0000 - AUROC: 1.0000 1/1 [==============================] - 0s 46ms/step - loss: 0.5343 - AUPRC: 1.0000 - AUROC: 1.0000 - val_loss: 0.6678 - val_AUPRC: 1.0000 - val_AUROC: 1.0000 Epoch 27/50
1/1 [==============================] - ETA: 0s - loss: 0.5204 - AUPRC: 1.0000 - AUROC: 1.0000 1/1 [==============================] - 0s 37ms/step - loss: 0.5204 - AUPRC: 1.0000 - AUROC: 1.0000 - val_loss: 0.6552 - val_AUPRC: 1.0000 - val_AUROC: 1.0000 Epoch 28/50
1/1 [==============================] - ETA: 0s - loss: 0.5078 - AUPRC: 1.0000 - AUROC: 1.0000 1/1 [==============================] - 0s 39ms/step - loss: 0.5078 - AUPRC: 1.0000 - AUROC: 1.0000 - val_loss: 0.6438 - val_AUPRC: 1.0000 - val_AUROC: 1.0000 Epoch 29/50
1/1 [==============================] - ETA: 0s - loss: 0.4963 - AUPRC: 1.0000 - AUROC: 1.0000 1/1 [==============================] - 0s 38ms/step - loss: 0.4963 - AUPRC: 1.0000 - AUROC: 1.0000 - val_loss: 0.6334 - val_AUPRC: 1.0000 - val_AUROC: 1.0000 Epoch 30/50
1/1 [==============================] - ETA: 0s - loss: 0.4860 - AUPRC: 1.0000 - AUROC: 1.0000 1/1 [==============================] - 0s 43ms/step - loss: 0.4860 - AUPRC: 1.0000 - AUROC: 1.0000 - val_loss: 0.6241 - val_AUPRC: 1.0000 - val_AUROC: 1.0000 Epoch 31/50
1/1 [==============================] - ETA: 0s - loss: 0.4766 - AUPRC: 1.0000 - AUROC: 1.0000 1/1 [==============================] - 0s 40ms/step - loss: 0.4766 - AUPRC: 1.0000 - AUROC: 1.0000 - val_loss: 0.6157 - val_AUPRC: 1.0000 - val_AUROC: 1.0000 Epoch 32/50
1/1 [==============================] - ETA: 0s - loss: 0.4682 - AUPRC: 1.0000 - AUROC: 1.0000 1/1 [==============================] - 0s 38ms/step - loss: 0.4682 - AUPRC: 1.0000 - AUROC: 1.0000 - val_loss: 0.6081 - val_AUPRC: 1.0000 - val_AUROC: 1.0000 Epoch 33/50
1/1 [==============================] - ETA: 0s - loss: 0.4607 - AUPRC: 1.0000 - AUROC: 1.0000 1/1 [==============================] - 0s 47ms/step - loss: 0.4607 - AUPRC: 1.0000 - AUROC: 1.0000 - val_loss: 0.6015 - val_AUPRC: 1.0000 - val_AUROC: 1.0000 Epoch 34/50
1/1 [==============================] - ETA: 0s - loss: 0.4540 - AUPRC: 1.0000 - AUROC: 1.0000 1/1 [==============================] - 0s 37ms/step - loss: 0.4540 - AUPRC: 1.0000 - AUROC: 1.0000 - val_loss: 0.5956 - val_AUPRC: 1.0000 - val_AUROC: 1.0000 Epoch 35/50
1/1 [==============================] - ETA: 0s - loss: 0.4481 - AUPRC: 1.0000 - AUROC: 1.0000 1/1 [==============================] - 0s 38ms/step - loss: 0.4481 - AUPRC: 1.0000 - AUROC: 1.0000 - val_loss: 0.5904 - val_AUPRC: 1.0000 - val_AUROC: 1.0000 Epoch 36/50
1/1 [==============================] - ETA: 0s - loss: 0.4430 - AUPRC: 1.0000 - AUROC: 1.0000 1/1 [==============================] - 0s 38ms/step - loss: 0.4430 - AUPRC: 1.0000 - AUROC: 1.0000 - val_loss: 0.5860 - val_AUPRC: 1.0000 - val_AUROC: 1.0000 Epoch 37/50
1/1 [==============================] - ETA: 0s - loss: 0.4384 - AUPRC: 1.0000 - AUROC: 1.0000 1/1 [==============================] - 0s 39ms/step - loss: 0.4384 - AUPRC: 1.0000 - AUROC: 1.0000 - val_loss: 0.5821 - val_AUPRC: 1.0000 - val_AUROC: 1.0000 Epoch 38/50
1/1 [==============================] - ETA: 0s - loss: 0.4345 - AUPRC: 1.0000 - AUROC: 1.0000 1/1 [==============================] - 0s 42ms/step - loss: 0.4345 - AUPRC: 1.0000 - AUROC: 1.0000 - val_loss: 0.5788 - val_AUPRC: 1.0000 - val_AUROC: 1.0000 Epoch 39/50
1/1 [==============================] - ETA: 0s - loss: 0.4312 - AUPRC: 1.0000 - AUROC: 1.0000 1/1 [==============================] - 0s 39ms/step - loss: 0.4312 - AUPRC: 1.0000 - AUROC: 1.0000 - val_loss: 0.5760 - val_AUPRC: 1.0000 - val_AUROC: 1.0000 Epoch 40/50
1/1 [==============================] - ETA: 0s - loss: 0.4283 - AUPRC: 1.0000 - AUROC: 1.0000 1/1 [==============================] - 0s 42ms/step - loss: 0.4283 - AUPRC: 1.0000 - AUROC: 1.0000 - val_loss: 0.5737 - val_AUPRC: 1.0000 - val_AUROC: 1.0000 Epoch 41/50
1/1 [==============================] - ETA: 0s - loss: 0.4260 - AUPRC: 1.0000 - AUROC: 1.0000 1/1 [==============================] - 0s 46ms/step - loss: 0.4260 - AUPRC: 1.0000 - AUROC: 1.0000 - val_loss: 0.5718 - val_AUPRC: 1.0000 - val_AUROC: 1.0000 Epoch 42/50
1/1 [==============================] - ETA: 0s - loss: 0.4240 - AUPRC: 1.0000 - AUROC: 1.0000 1/1 [==============================] - 0s 40ms/step - loss: 0.4240 - AUPRC: 1.0000 - AUROC: 1.0000 - val_loss: 0.5703 - val_AUPRC: 1.0000 - val_AUROC: 1.0000 Epoch 43/50
1/1 [==============================] - ETA: 0s - loss: 0.4225 - AUPRC: 1.0000 - AUROC: 1.0000 1/1 [==============================] - 0s 40ms/step - loss: 0.4225 - AUPRC: 1.0000 - AUROC: 1.0000 - val_loss: 0.5691 - val_AUPRC: 1.0000 - val_AUROC: 1.0000 Epoch 44/50
1/1 [==============================] - ETA: 0s - loss: 0.4212 - AUPRC: 1.0000 - AUROC: 1.0000 1/1 [==============================] - 0s 43ms/step - loss: 0.4212 - AUPRC: 1.0000 - AUROC: 1.0000 - val_loss: 0.5682 - val_AUPRC: 1.0000 - val_AUROC: 1.0000 Epoch 45/50
1/1 [==============================] - ETA: 0s - loss: 0.4203 - AUPRC: 1.0000 - AUROC: 1.0000 1/1 [==============================] - 0s 40ms/step - loss: 0.4203 - AUPRC: 1.0000 - AUROC: 1.0000 - val_loss: 0.5675 - val_AUPRC: 1.0000 - val_AUROC: 1.0000 Epoch 46/50
1/1 [==============================] - ETA: 0s - loss: 0.4196 - AUPRC: 1.0000 - AUROC: 1.0000 1/1 [==============================] - 0s 39ms/step - loss: 0.4196 - AUPRC: 1.0000 - AUROC: 1.0000 - val_loss: 0.5671 - val_AUPRC: 1.0000 - val_AUROC: 1.0000 Epoch 47/50
1/1 [==============================] - ETA: 0s - loss: 0.4191 - AUPRC: 1.0000 - AUROC: 1.0000 1/1 [==============================] - 0s 46ms/step - loss: 0.4191 - AUPRC: 1.0000 - AUROC: 1.0000 - val_loss: 0.5668 - val_AUPRC: 1.0000 - val_AUROC: 1.0000 Epoch 48/50
1/1 [==============================] - ETA: 0s - loss: 0.4188 - AUPRC: 1.0000 - AUROC: 1.0000 1/1 [==============================] - 0s 39ms/step - loss: 0.4188 - AUPRC: 1.0000 - AUROC: 1.0000 - val_loss: 0.5666 - val_AUPRC: 1.0000 - val_AUROC: 1.0000 Epoch 49/50
1/1 [==============================] - ETA: 0s - loss: 0.4186 - AUPRC: 1.0000 - AUROC: 1.0000 1/1 [==============================] - 0s 40ms/step - loss: 0.4186 - AUPRC: 1.0000 - AUROC: 1.0000 - val_loss: 0.5665 - val_AUPRC: 1.0000 - val_AUROC: 1.0000 Epoch 50/50
1/1 [==============================] - ETA: 0s - loss: 0.4185 - AUPRC: 1.0000 - AUROC: 1.0000 1/1 [==============================] - 0s 39ms/step - loss: 0.4185 - AUPRC: 1.0000 - AUROC: 1.0000 - val_loss: 0.5665 - val_AUPRC: 1.0000 - val_AUROC: 1.0000 WARNING:absl:Importing a function (inference_BLOCK_4-2_ACT_1_layer_call_and_return_conditional_losses_49739) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_2-1_SE_CONV_1_layer_call_and_return_conditional_losses_21100) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_3-4_ACT_2_layer_call_and_return_conditional_losses_22243) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_3-5_ACT_2_layer_call_and_return_conditional_losses_49087) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_model_1_layer_call_and_return_conditional_losses_43539) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_3-5_SE_CONV_1_layer_call_and_return_conditional_losses_22447) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_1-1_ACT_1_layer_call_and_return_conditional_losses_44678) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_3-4_SE_CONV_1_layer_call_and_return_conditional_losses_22275) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_2-4_ACT_1_layer_call_and_return_conditional_losses_46900) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (__inference_BLOCK_4-1_ACT_1_layer_call_and_return_conditional_losses_49367) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_4-2_SE_CONV_1_layer_call_and_return_conditional_losses_49917) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_3-1_SE_CONV_1_layer_call_and_return_conditional_losses_21774) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_3-1_SE_CONV_1_layer_call_and_return_conditional_losses_47497) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_3-2_ACT_1_layer_call_and_return_conditional_losses_47691) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (__inference_BLOCK_4-3_ACT_1_layer_call_and_return_conditional_losses_50158) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_3-3_ACT_2_layer_call_and_return_conditional_losses_22071) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_2-1_ACT_2_layer_call_and_return_conditional_losses_21068) with ops with unsaved custom gradients. 
Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_4-4_SE_CONV_1_layer_call_and_return_conditional_losses_23121) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_4-3_ACT_2_layer_call_and_return_conditional_losses_22917) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_2-1_SE_CONV_1_layer_call_and_return_conditional_losses_45868) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_4-3_SE_CONV_1_layer_call_and_return_conditional_losses_50336) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_3-2_SE_CONV_1_layer_call_and_return_conditional_losses_21931) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_4-2_ACT_2_layer_call_and_return_conditional_losses_22745) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_2-4_SE_CONV_1_layer_call_and_return_conditional_losses_21601) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_2-1_ACT_1_layer_call_and_return_conditional_losses_21025) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_4-2_SE_CONV_1_layer_call_and_return_conditional_losses_22777) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_4-3_ACT_1_layer_call_and_return_conditional_losses_22875) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (__inference_BLOCK_2-2_ACT_2_layer_call_and_return_conditional_losses_46201) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_2-3_ACT_2_layer_call_and_return_conditional_losses_21397) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_3-2_SE_CONV_1_layer_call_and_return_conditional_losses_47869) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_4-1_SE_CONV_1_layer_call_and_return_conditional_losses_49545) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_2-3_SE_CONV_1_layer_call_and_return_conditional_losses_21429) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_1-1_ACT_2_layer_call_and_return_conditional_losses_44817) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_3-3_SE_CONV_1_layer_call_and_return_conditional_losses_22103) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_3-1_ACT_2_layer_call_and_return_conditional_losses_47458) with ops with unsaved custom gradients. 
Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_model_1_layer_call_and_return_conditional_losses_41786) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_4-3_SE_CONV_1_layer_call_and_return_conditional_losses_22949) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_1-2_ACT_1_layer_call_and_return_conditional_losses_44984) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_3-4_ACT_1_layer_call_and_return_conditional_losses_22201) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_2-3_ACT_1_layer_call_and_return_conditional_losses_46481) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (__inference_BLOCK_3-3_ACT_1_layer_call_and_return_conditional_losses_48110) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_4-2_ACT_2_layer_call_and_return_conditional_losses_49878) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_2-3_ACT_2_layer_call_and_return_conditional_losses_46620) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (__inference_BLOCK_1-2_ACT_2_layer_call_and_return_conditional_losses_45123) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_1-3_ACT_1_layer_call_and_return_conditional_losses_20907) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_4-1_ACT_2_layer_call_and_return_conditional_losses_22588) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_2-2_SE_CONV_1_layer_call_and_return_conditional_losses_46240) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_4-1_ACT_1_layer_call_and_return_conditional_losses_22545) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_2-4_ACT_1_layer_call_and_return_conditional_losses_21527) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_embeddings_13070) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_3-3_ACT_1_layer_call_and_return_conditional_losses_22029) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_3-5_ACT_1_layer_call_and_return_conditional_losses_48948) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (__inference_BLOCK_3-1_ACT_1_layer_call_and_return_conditional_losses_47319) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. 
WARNING:absl:Importing a function (inference_BLOCK_1-1_ACT_1_layer_call_and_return_conditional_losses_20685) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_2-3_ACT_1_layer_call_and_return_conditional_losses_21355) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (__inference_BLOCK_3-2_ACT_2_layer_call_and_return_conditional_losses_21899) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_2-4_ACT_2_layer_call_and_return_conditional_losses_47039) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_3-5_SE_CONV_1_layer_call_and_return_conditional_losses_49126) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_2-1_ACT_2_layer_call_and_return_conditional_losses_45829) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_3-3_SE_CONV_1_layer_call_and_return_conditional_losses_48288) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_model_layer_call_and_return_conditional_losses_37532) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_3-4_ACT_2_layer_call_and_return_conditional_losses_48668) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_4-1_SE_CONV_1_layer_call_and_return_conditional_losses_22620) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_1-2_ACT_2_layer_call_and_return_conditional_losses_20831) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_4-4_ACT_1_layer_call_and_return_conditional_losses_50577) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_2-4_SE_CONV_1_layer_call_and_return_conditional_losses_47078) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_2-2_ACT_1_layer_call_and_return_conditional_losses_46062) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_2-2_ACT_1_layer_call_and_return_conditional_losses_21183) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (__inference_BLOCK_4-3_ACT_2_layer_call_and_return_conditional_losses_50297) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_basic_11033) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_2-4_ACT_2_layer_call_and_return_conditional_losses_21569) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. 
WARNING:absl:Importing a function (__inference_BLOCK_3-5_ACT_1_layer_call_and_return_conditional_losses_22373) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_2-2_ACT_2_layer_call_and_return_conditional_losses_21225) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_3-2_ACT_2_layer_call_and_return_conditional_losses_47830) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_model_layer_call_and_return_conditional_losses_39299) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_4-4_ACT_2_layer_call_and_return_conditional_losses_50716) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_3-3_ACT_2_layer_call_and_return_conditional_losses_48249) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_3-1_ACT_1_layer_call_and_return_conditional_losses_21699) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (__inference_BLOCK_4-4_ACT_2_layer_call_and_return_conditional_losses_23089) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_3-5_ACT_2_layer_call_and_return_conditional_losses_22415) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_4-1_ACT_2_layer_call_and_return_conditional_losses_49506) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (__inference_BLOCK_3-2_ACT_1_layer_call_and_return_conditional_losses_21857) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_1-2_ACT_1_layer_call_and_return_conditional_losses_20789) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_3-4_ACT_1_layer_call_and_return_conditional_losses_48529) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (__inference_BLOCK_4-2_ACT_1_layer_call_and_return_conditional_losses_22703) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_2-2_SE_CONV_1_layer_call_and_return_conditional_losses_21257) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inferencewrapped_model_15110) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_1-3_ACT_1_layer_call_and_return_conditional_losses_45337) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_3-4_SE_CONV_1_layer_call_and_return_conditional_losses_48707) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. 
WARNING:absl:Importing a function (inference_BLOCK_1-3_ACT_2_layer_call_and_return_conditional_losses_45476) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_2-3_SE_CONV_1_layer_call_and_return_conditional_losses_46659) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_3-1_ACT_2_layer_call_and_return_conditional_losses_21742) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (__inference_BLOCK_1-3_ACT_2_layer_call_and_return_conditional_losses_20949) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_4-4_ACT_1_layer_call_and_return_conditional_losses_23047) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_4-4_SE_CONV_1_layer_call_and_return_conditional_losses_50755) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (inference_BLOCK_1-1_ACT_2_layer_call_and_return_conditional_losses_20728) with ops with unsaved custom gradients. Will likely fail if a gradient is requested. WARNING:absl:Importing a function (__inference_BLOCK_2-1_ACT_1_layer_call_and_return_conditional_losses_45690) with ops with unsaved custom gradients. Will likely fail if a gradient is requested.
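For anyone who sees the same AutoGraph messages: they are warnings, not errors, and each one already names a way to silence it, namely decorating the affected function with @tf.autograph.experimental.do_not_convert. Below is a minimal, hypothetical sketch of that decorator in use; the custom_loss body is a placeholder for illustration and is not BirdNET's actual loss function.

```python
import tensorflow as tf

# Hypothetical stand-in for the custom_loss named in the warnings above.
# @tf.autograph.experimental.do_not_convert tells AutoGraph to skip source
# transformation and run the Python function as-is, which silences the
# "could not get source code" warning for graph-compatible code.
@tf.autograph.experimental.do_not_convert
def custom_loss(y_true, y_pred):
    # Placeholder loss for illustration only.
    return tf.reduce_mean(tf.keras.losses.binary_crossentropy(y_true, y_pred))
```

The deprecation notices (tf.losses.sparse_softmax_cross_entropy, tf.ragged.RaggedTensorValue) come from inside Keras itself (keras\src\losses.py, keras\src\utils\tf_utils.py), so there is nothing to change on the user side; they can be ignored.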