Open MeowTheCat opened 2 months ago
just as an idea, maybe this could help (a custom loop):

```python
import tensorflow as tf
import keras

# model, dataset, and my_metric are assumed to be defined elsewhere.
loss_fn = keras.losses.CategoricalCrossentropy(from_logits=True)
optimizer = keras.optimizers.Adam()
other_args = [1, 2, 3]
epoch_results = []

for step, (x, y) in enumerate(dataset):
    with tf.GradientTape() as tape:
        logits = model(x, training=True)
        loss_value = loss_fn(y, logits)
    gradients = tape.gradient(loss_value, model.trainable_weights)
    optimizer.apply_gradients(zip(gradients, model.trainable_weights))
    # compute the custom metric outside the tape so it isn't traced
    epoch_results.append(my_metric(y, logits, other_args))  # <----

print(sum(epoch_results) / len(epoch_results))
```
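As a sketch of the same idea without storing every per-batch value, the list can be replaced by a small running-mean aggregator (this is essentially what `keras.metrics.Mean` does internally). `my_metric` and `other_args` below are the hypothetical names from the snippet above, with toy scalar values standing in for real tensors:

```python
class RunningMean:
    """Streaming mean: keeps a total and a count instead of
    accumulating every per-batch metric value in a list."""
    def __init__(self):
        self.total = 0.0
        self.count = 0

    def update_state(self, value):
        self.total += float(value)
        self.count += 1

    def result(self):
        return self.total / self.count if self.count else 0.0

# Hypothetical metric needing extra inputs beyond y_true/y_pred,
# mirroring my_metric(y, logits, other_args) from the loop above.
def my_metric(y_true, y_pred, other_args):
    return abs(y_true - y_pred) * sum(other_args)

tracker = RunningMean()
for y, logits in [(1.0, 0.5), (0.0, 0.25)]:
    tracker.update_state(my_metric(y, logits, [1, 2, 3]))
print(tracker.result())  # mean of 3.0 and 1.5 -> 2.25
```

In a real loop, `tracker.update_state(...)` would be called once per batch and `tracker.result()` printed at the end of each epoch.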
Previously, in Keras 2, I could use model.add_metric() to add any arbitrary tensor as a metric.
But Keras 3 removed the .add_metric() method from Layer and Model. How can I achieve the same thing?
BTW, I can't subclass keras.metrics.Metric, because my metric calculation requires more tensors than just y_true and y_pred.
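(One workaround sometimes used for the extra-tensor problem, sketched here with plain Python scalars rather than real tensors: bind the additional inputs up front with functools.partial or a closure, so the wrapped callable matches a two-argument (y_true, y_pred) interface. my_metric and other_args are the hypothetical names from above.)

```python
from functools import partial

def my_metric(y_true, y_pred, other_args):
    # Hypothetical metric needing inputs beyond y_true/y_pred.
    return abs(y_true - y_pred) * sum(other_args)

# Bind the extra arguments; the result is a two-argument callable
# with the (y_true, y_pred) shape a metric interface expects.
bound_metric = partial(my_metric, other_args=[1, 2, 3])
print(bound_metric(1.0, 0.5))  # -> 3.0
```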
Thanks