I was able to get this working with TensorFlow 2 on Windows 10 by converting a couple of things for compatibility. However, I'm stumped on one problem regarding the optimizer portion of the code. The Adam optimizer works fine with:
optimizer = tf.compat.v1.train.AdamOptimizer(args.learning_rate)
The L-BFGS-B optimizer, however, does not work with TensorFlow 2, as it has been removed. This is the part that needs to be migrated to TensorFlow 2.
I looked into
tfp.optimizer.lbfgs_minimize
but I have no idea how to get it working in the existing code. Do you have any ideas? I am using Nvidia Ampere-architecture cards (RTX 3090), which do not support TensorFlow 1.x because of their CUDA compute capability. I would really like to use L-BFGS instead of Adam for better results.