Open SamMohel opened 7 months ago
I tried to follow the example of using the MAGRAD optimizer:

```python
import torch_optimizer as optim

optimizer = optim.MAGRAD(model.parameters(), lr=0.1)
optimizer.zero_grad()
loss_fn(model(input), target).backward()
optimizer.step()
```
but got:

```
AttributeError: module 'torch_optimizer' has no attribute 'MAGRAD'
```