# How to customize the optimizer and loss

**fade54188** opened this issue 1 year ago · Open

I tried customizing the optimizer, but it failed.
Hello! You can customize the optimizer in any of the following ways:
### 1. Modify the optimizer through the config file

You can modify it directly here: https://github.com/IDEA-Research/detrex/blob/main/configs/common/optim.py. For example, to swap `AdamW` for `Adam`:
```python
import torch
from detectron2.config import LazyCall as L
from detectron2.solver.build import get_default_optimizer_params

# Define the Adam optimizer here.
Adam = L(torch.optim.Adam)(
    params=L(get_default_optimizer_params)(
        # params.model is meant to be set to the model object
        # before instantiating the optimizer.
        base_lr="${..lr}",
        weight_decay_norm=0.0,
    ),
    lr=1e-4,
    betas=(0.9, 0.999),
    weight_decay=0.1,
)
```
Then you can use it in your project config as:

```python
from detrex.config import get_config

# Load the Adam optimizer defined in the default config.
optimizer = get_config("common/optim.py").Adam
```
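For intuition, `LazyCall` only records the target callable and its keyword arguments; nothing is constructed until the trainer instantiates the config. The following is a minimal, self-contained sketch of that mechanism — the `L`/`instantiate` pair and the `Adam` class here are hand-rolled stand-ins, not detectron2's or PyTorch's actual implementations:

```python
class Adam:
    """Stand-in for torch.optim.Adam, so the sketch runs anywhere."""
    def __init__(self, params, lr, betas=(0.9, 0.999), weight_decay=0.0):
        self.params = params
        self.lr = lr
        self.betas = betas
        self.weight_decay = weight_decay

def L(target):
    # Record the target callable and its kwargs without calling it yet.
    def bind(**kwargs):
        return {"_target_": target, **kwargs}
    return bind

def instantiate(cfg):
    # Resolve the recorded call: pop the target and invoke it.
    cfg = dict(cfg)
    return cfg.pop("_target_")(**cfg)

optimizer_cfg = L(Adam)(params=[], lr=1e-4, weight_decay=0.1)
optimizer = instantiate(optimizer_cfg)  # the Adam object is built only here
```

Because construction is deferred, the training script can still patch fields like `params` or `lr` on the config object before anything is built.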
### 2. Override it in your project config file

```python
import torch
from detrex.config import get_config

# Load the default AdamW config, then retarget it to Adam.
optimizer = get_config("common/optim.py").AdamW
optimizer._target_ = torch.optim.Adam
```
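Because the loaded config is just a deferred call record, reassigning its `_target_` is enough to change which optimizer class gets built at instantiation time. A self-contained illustration with stand-in classes (not the real torch optimizers, and a plain dict in place of the loaded config):

```python
# Stand-ins for torch.optim.AdamW / torch.optim.Adam so the sketch runs anywhere.
class AdamW:
    def __init__(self, lr):
        self.lr = lr

class Adam:
    def __init__(self, lr):
        self.lr = lr

# This dict plays the role of get_config("common/optim.py").AdamW.
cfg = {"_target_": AdamW, "lr": 1e-4}

# The override from the snippet above: retarget the same config to Adam.
cfg["_target_"] = Adam

# At instantiation time, the recorded kwargs are passed to the new target.
kwargs = {k: v for k, v in cfg.items() if k != "_target_"}
optimizer = cfg["_target_"](**kwargs)
```

Note that this only works cleanly when the recorded kwargs are valid for the new target; `Adam` and `AdamW` share the same constructor signature, so the swap is safe here.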
### 3. Define it under the `optimizer` namespace in your project config

```python
import torch
from detectron2.config import LazyCall as L
from detectron2.solver.build import get_default_optimizer_params

# Define the optimizer directly under the optimizer namespace.
optimizer = L(torch.optim.Adam)(
    params=L(get_default_optimizer_params)(
        # params.model is meant to be set to the model object
        # before instantiating the optimizer.
        base_lr="${..lr}",
        weight_decay_norm=0.0,
    ),
    lr=1e-4,
    betas=(0.9, 0.999),
    weight_decay=0.1,
)
```
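A note on the `base_lr="${..lr}"` line used in these snippets: it is an omegaconf relative interpolation, where `..` refers to the parent node, so `params.base_lr` automatically follows `optimizer.lr`. Below is a toy resolver that only handles this one `"${..key}"` pattern, purely to show what gets looked up (omegaconf's real resolver is far more general):

```python
def resolve(root, path):
    # Walk to the node at `path`, then resolve a "${..key}" reference
    # against that node's parent (illustrative, not omegaconf).
    node = root
    for key in path[:-1]:
        node = node[key]
    val = node[path[-1]]
    if isinstance(val, str) and val.startswith("${..") and val.endswith("}"):
        ref_key = val[4:-1]
        parent = root
        for key in path[:-2]:
            parent = parent[key]
        return parent[ref_key]
    return val

# Mirrors the structure of the optimizer config above.
optimizer = {"lr": 1e-4, "params": {"base_lr": "${..lr}"}}
base_lr = resolve(optimizer, ["params", "base_lr"])  # follows optimizer.lr
```

This is why changing `lr` once on the optimizer config is enough; you do not need to update `base_lr` separately.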