As documented in Issue #19262 on the Keras GitHub repository, there was a conflict between the Transformers and Keras libraries. Specifically, executing the following code resulted in an error associated with the Adam optimizer:
```python
import tensorflow as tf
from transformers import TFAutoModelForSequenceClassification

# Load a sequence-classification head on top of the checkpoint
model = TFAutoModelForSequenceClassification.from_pretrained('checkpoint', num_labels=2)
loss = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)

# Compiling with the "adam" optimizer string is what triggered the error
model.compile(optimizer='adam', loss=loss, metrics=["accuracy"])
```
I have updated the notebook to resolve this issue. With the modifications applied, the error no longer occurs, ensuring compatibility between the two libraries.
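The updated notebook itself contains the actual modifications; purely as an illustration, a minimal sketch of one workaround commonly suggested for this class of Keras 3 incompatibility is shown below. The `TF_USE_LEGACY_KERAS` switch (which requires the `tf-keras` package), the `bert-base-cased` checkpoint name, and the explicit Adam construction are assumptions for the example and may not match the notebook's exact changes.

```python
import os

# Assumption: force the legacy Keras 2 implementation before TensorFlow is imported,
# a common workaround for Keras 3 incompatibilities (requires `pip install tf-keras`).
os.environ["TF_USE_LEGACY_KERAS"] = "1"

import tensorflow as tf
from transformers import TFAutoModelForSequenceClassification

# "bert-base-cased" is a hypothetical checkpoint used purely for illustration.
model = TFAutoModelForSequenceClassification.from_pretrained(
    "bert-base-cased", num_labels=2
)
loss = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)

# Construct the Adam optimizer explicitly rather than passing the "adam" string,
# so there is no ambiguity about which optimizer class is resolved at compile time.
optimizer = tf.keras.optimizers.Adam(learning_rate=5e-5)
model.compile(optimizer=optimizer, loss=loss, metrics=["accuracy"])
```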