I am wondering how I can configure the optimizer for my loss function. I changed the optimizer type to Adam, but the Adam optimizer requires beta values. When I define the beta values as "betas", runx gives the following error:
train.py: error: unrecognized arguments: --betas 0.9
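For reference, this is the kind of behavior I am seeing. The sketch below is a minimal, hypothetical argparse setup (not the actual train.py): a `--betas` flag that was never declared produces exactly this "unrecognized arguments" error, whereas declaring it with `nargs=2` lets the parser accept both Adam beta values.

```python
import argparse

# Hypothetical minimal parser sketching the issue (not the real train.py).
parser = argparse.ArgumentParser(prog="train.py")
parser.add_argument("--lr", type=float, default=1e-3)
# Declaring --betas with nargs=2 accepts the two Adam beta values;
# without this line, passing --betas raises "unrecognized arguments".
parser.add_argument("--betas", type=float, nargs=2, default=[0.9, 0.999])

args = parser.parse_args(["--betas", "0.9", "0.999"])
print(tuple(args.betas))  # -> (0.9, 0.999)
```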
Any help with this would be appreciated.