JDACS4C-IMPROVE / UNO

Standalone UNO repo for the IMPROVE project

Support for TF 2.16 #2

Open j-woz opened 2 weeks ago

j-woz commented 2 weeks ago

optimizer.lr disappeared from the Keras API in TF 2.16, breaking Uno. This seems to work for TF 2.15 and 2.16.

Should we do it like this or simply refer to params["learning_rate"]?
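The attached change isn't shown here, but a minimal sketch of an accessor that works on both sides of the API change might look like the following. The stub classes only imitate the attribute difference (`learning_rate` in TF 2.16 / Keras 3, the `lr` alias in TF ≤ 2.15); the helper name is hypothetical, not Uno code.

```python
def get_learning_rate(optimizer):
    """Read the learning rate whether the optimizer exposes
    `learning_rate` (TF 2.16 / Keras 3) or only the old `lr` alias."""
    for attr in ("learning_rate", "lr"):
        value = getattr(optimizer, attr, None)
        if value is not None:
            return float(value)
    raise AttributeError("optimizer exposes neither `learning_rate` nor `lr`")

# Stubs standing in for the two optimizer APIs (for illustration only):
class OldStyleOptimizer:     # TF <= 2.15: `lr` alias available
    lr = 0.001

class NewStyleOptimizer:     # TF 2.16 / Keras 3: `learning_rate` only
    learning_rate = 0.001

print(get_learning_rate(OldStyleOptimizer()))   # 0.001
print(get_learning_rate(NewStyleOptimizer()))   # 0.001
```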

rajeeja commented 2 days ago

Using params["learning_rate"] directly feels like the more solid and future-proof way to go, since it keeps the code in line with newer TensorFlow versions and avoids relying on an API attribute that's been removed. What do you think about adding version checks to handle both TensorFlow 2.15 and 2.16? That way, we could switch between optimizer.learning_rate and params["learning_rate"] as needed.
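A rough sketch of the version-check idea above: branch on the installed TensorFlow version and read params["learning_rate"] on TF 2.16+, where optimizer.lr is gone. The version-parsing helper and function name are assumptions for illustration, not Uno code, and a stub optimizer stands in for a real Keras one.

```python
def resolve_learning_rate(optimizer, params, tf_version):
    """Pick the learning-rate source based on the TF version string,
    e.g. "2.15.0" or "2.16.1"."""
    major, minor = (int(p) for p in tf_version.split(".")[:2])
    if (major, minor) >= (2, 16):
        # TF 2.16 ships Keras 3, which dropped the `lr` alias;
        # fall back to the value stored in the hyperparameter dict.
        return params["learning_rate"]
    return float(optimizer.lr)

# Stub standing in for a TF <= 2.15 optimizer (illustration only):
class LegacyOptimizer:
    lr = 0.01

params = {"learning_rate": 0.01}
print(resolve_learning_rate(LegacyOptimizer(), params, "2.15.0"))  # 0.01
print(resolve_learning_rate(LegacyOptimizer(), params, "2.16.1"))  # 0.01
```

Either branch returns the same value here, which is the point: callers don't need to know which API the installed TensorFlow provides.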

rajeeja commented 2 days ago

I'll test it out and create a PR referencing this issue.