Thanks for the report. I can reproduce the same error with `tensorflow 1.13.1`. Updating to `tensorflow 1.14.0` solved the problem. Here are the complete installation commands:
```
conda create -n pde python=3.6
conda activate pde
pip install tensorflow==1.14.0
pip install git+git://github.com/google-research/data-driven-pdes
```
Alternatively, you can just drop the `metrics` argument completely (it won't affect training) and simply use `optimizer='adam'` (the original code just uses the default learning rate).
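As a minimal sketch (the `Sequential` stand-in model and the `'mae'` loss are placeholders, not the notebook's actual values), the simplified call looks like:

```python
import tensorflow as tf

# Stand-in model: the notebook builds its own network; this is only for illustration.
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(1,))])

# metrics dropped entirely; the string 'adam' uses Adam's default learning rate,
# matching the original code. The 'mae' loss is a placeholder.
model.compile(optimizer='adam', loss='mae')
```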
Thank you, @JiaweiZhuang. Changing the version of tensorflow as suggested gets past those two errors. However, before changing the tensorflow version, I had tried the alternative you suggested; although it fixes the issue with the `metrics` argument, the second error I mentioned earlier still remains.
> the second error I mentioned earlier still remains.
Ah, you are right. I also got `AttributeError: 'Adam' object has no attribute 'apply_gradients'` when using `optimizer='adam'` with `tensorflow 1.13.1`.
Maybe this is related to the migration to TF 2.0 (e.g. tensorflow/tensorflow#27386). We should probably require `tensorflow==1.14.0` for now.
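If you are stuck on 1.13.x, one possible (untested) workaround is to pass the v1 optimizer from `tf.train`, which does implement `apply_gradients`; a sketch, with the model and loss as placeholders:

```python
import tensorflow as tf

# Untested workaround sketch for tensorflow 1.13.x: tf.train.AdamOptimizer is the
# v1 optimizer and does implement apply_gradients, unlike the Keras Adam that the
# string 'adam' resolves to on 1.13.
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(1,))])
model.compile(optimizer=tf.train.AdamOptimizer(learning_rate=1e-3), loss='mae')
```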
Thank you.
I am getting errors while running the `advection_1d` example given in this python notebook. The tensorflow version I have is the same as the version shown in that example. The error I am getting is at the following line, just before the neural network is trained with the data:
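In sketch form (the stand-in model and the `'mae'` loss are placeholders for the notebook's actual values), the failing call is the model compilation:

```python
import tensorflow as tf

# Stand-in model; the notebook's real network is more involved.
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(1,))])

# On tensorflow 1.13.1 this raises AttributeError, because
# tf.keras.metrics.RootMeanSquaredError was only added in 1.14.
model.compile(optimizer='adam',
              loss='mae',
              metrics=[tf.keras.metrics.RootMeanSquaredError()])
```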
I checked the module `tf.keras.metrics` and it does not have any attribute called `RootMeanSquaredError`, which is the reason for the error. After some searching, I changed the `metrics` argument in the line above to `metrics=["RootMeanSquaredError"]`, which gets rid of that particular error, but I then get another error on the next line: `AttributeError: 'Adam' object has no attribute 'apply_gradients'`. This error does not go away whether or not I enable eager execution. I was curious how the example in that notebook ran with the exact same version of tensorflow that I am also using with Python 3.6. I would appreciate your help with this.
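For reference, a quick diagnostic sketch to check which side of these API changes an installation is on (both checks mirror the two errors discussed in this thread):

```python
import tensorflow as tf

print(tf.__version__)

# False on 1.13.1, True on 1.14.0: the metric class the notebook uses.
print(hasattr(tf.keras.metrics, "RootMeanSquaredError"))

# False on 1.13.1, True on 1.14.0: the method the training step needs.
print(hasattr(tf.keras.optimizers.Adam(), "apply_gradients"))
```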