lu-group / deeponet-fno

DeepONet & FNO (with practical extensions)

Run Darcy flow with AdamW: optimizer not implemented error #14

Closed: BraveDrXuTF closed this issue 9 months ago

BraveDrXuTF commented 9 months ago

Running src\darcy_rectangular_pwc\deeponet.py fails with

    NotImplementedError: <tensorflow_addons.optimizers.weight_decay_optimizers.AdamW object at 0x00000136F434F6D0> to be implemented for backend tensorflow.compat.v1.

The model is compiled as

    model.compile(
        tfa.optimizers.AdamW(1e-4, learning_rate=3e-4),
        lr=3e-4,
        decay=("inverse time", 1, 1e-4),
        metrics=["mean l2 relative error"],
    )

where I have added lr=3e-4. If I use the original code,

    model.compile(
        tfa.optimizers.AdamW(1e-4, learning_rate=3e-4),
        decay=("inverse time", 1, 1e-4),
        metrics=["mean l2 relative error"],
    )

I get another error:

    ValueError: No learning rate for <tensorflow_addons.optimizers.weight_decay_optimizers.AdamW object at 0x00000284AC2BF100>.

lululxvi commented 9 months ago

Use tensorflow 2.x backend.
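
For reference, a minimal sketch of switching the DeepXDE backend to TensorFlow 2.x (assuming DeepXDE 1.x, where the DDE_BACKEND environment variable and the set_default_backend helper are the two documented mechanisms):

    import os

    # Must be set before deepxde is imported; DeepXDE reads DDE_BACKEND at import time.
    os.environ["DDE_BACKEND"] = "tensorflow"

    import deepxde as dde

    print(dde.backend.backend_name)  # expect "tensorflow"

    # Alternatively, persist the default once from the command line
    # (writes ~/.deepxde/config.json):
    #   python -m deepxde.backend.set_default_backend tensorflow

With the tensorflow backend active, the original compile call should work as written, since that backend accepts Keras optimizer instances such as tfa.optimizers.AdamW directly instead of requiring a separate lr argument.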

BraveDrXuTF commented 9 months ago

I have solved it. Thank you!