Open j-woz opened 2 weeks ago
Using params["learning_rate"] directly feels like the more solid, future-proof approach, since it keeps the code aligned with newer TensorFlow versions and avoids relying on an attribute that has been removed. What do you think about adding version checks to support both TensorFlow 2.15 and 2.16? That way, we could switch between optimizer.learning_rate and params["learning_rate"] as needed.
I'll test it out and create a PR referencing this issue.
optimizer.lr disappeared from the Keras API in TF 2.16, breaking Uno. This seems to work for both TF 2.15 and 2.16.
Should we do it like this, or simply refer to params["learning_rate"]?
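A minimal sketch of the version-tolerant approach discussed above, assuming a helper function (the name `get_learning_rate` and the `params` fallback are illustrative, not the actual PR): prefer `optimizer.learning_rate`, which exists in both TF 2.15 and 2.16, and fall back to the older `lr` alias or the params dict.

```python
def get_learning_rate(optimizer, params=None):
    """Return the learning rate across TF/Keras versions.

    TF 2.16 removed the `optimizer.lr` alias from the Keras API;
    `learning_rate` is available in both 2.15 and 2.16, so try it
    first, then fall back to `lr`, then to params["learning_rate"].
    """
    if hasattr(optimizer, "learning_rate"):
        return optimizer.learning_rate
    if hasattr(optimizer, "lr"):
        return optimizer.lr
    # Last resort: read the configured value rather than the optimizer.
    return params["learning_rate"]
```

This keeps a single call site working on both versions without scattering `try/except AttributeError` or explicit TF version string checks through the code.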