IDEA-Research / detrex

detrex is a research platform for DETR-based object detection, segmentation, pose estimation and other visual recognition tasks.
https://detrex.readthedocs.io/en/latest/
Apache License 2.0

customize #283

Open fade54188 opened 1 year ago

fade54188 commented 1 year ago

How can I customize the optimizer and the loss?

fade54188 commented 1 year ago

I tried customizing the optimizer, but it failed.

rentainhe commented 1 year ago

Hello, you can customize the optimizer in the following ways:

1. Modify the optimizer in the default config file

You can modify it directly here: https://github.com/IDEA-Research/detrex/blob/main/configs/common/optim.py

For example, changing AdamW to Adam:

import torch

from detectron2.config import LazyCall as L
from detectron2.solver.build import get_default_optimizer_params

# define Adam Optimizer here
Adam = L(torch.optim.Adam)(
    params=L(get_default_optimizer_params)(
        # params.model is meant to be set to the model object, before instantiating
        # the optimizer.
        base_lr="${..lr}",
        weight_decay_norm=0.0,
    ),
    lr=1e-4,
    betas=(0.9, 0.999),
    weight_decay=0.1,
)
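The base_lr="${..lr}" entry is an OmegaConf relative interpolation: each extra leading dot walks one level up from the node that contains it, so base_lr resolves to the optimizer node's lr and the two values stay in sync. A minimal sketch of the mechanism on its own (the keys here are illustrative, not from the repo):

from omegaconf import OmegaConf

# one leading dot refers to the current node; each extra dot goes one level up
cfg = OmegaConf.create({"lr": 1e-4, "params": {"base_lr": "${..lr}"}})
print(cfg.params.base_lr)  # 0.0001, read from the parent node's lr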

Then you can use it in your project config as:

from detrex.config import get_config

# get default config
optimizer = get_config("common/optim.py").Adam
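For context, the training script (detrex follows detectron2's LazyConfig workflow here) is what finally sets params.model and instantiates the optimizer; a minimal sketch, reusing the optimizer config from above with a stand-in module:

import torch.nn as nn
from detectron2.config import instantiate

# stand-in module for illustration only; the real training script passes
# the detection model built from the config
model = nn.Linear(16, 4)

optimizer.params.model = model  # fill in params.model before instantiation
optim = instantiate(optimizer)  # now a concrete torch.optim.Adam over model's parameters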

2. Override it in your project config file

import torch

from detrex.config import get_config

# get default config
optimizer = get_config("common/optim.py").AdamW

# swap the target class: the remaining arguments (lr, betas, weight_decay)
# are accepted by both AdamW and Adam
optimizer._target_ = torch.optim.Adam
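Swapping _target_ like this works because Adam takes the same keyword arguments as AdamW. If you switch to an optimizer with a different signature, adjust the stored kwargs too; for example (a sketch, not from the thread):

import torch

optimizer._target_ = torch.optim.SGD
optimizer.pop("betas")    # SGD does not accept betas
optimizer.momentum = 0.9  # hypothetical value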

3. Define it under the optimizer namespace in your project config

import torch

from detectron2.config import LazyCall as L
from detectron2.solver.build import get_default_optimizer_params

# define the config directly under the optimizer namespace
optimizer = L(torch.optim.Adam)(
    params=L(get_default_optimizer_params)(
        # params.model is meant to be set to the model object, before instantiating
        # the optimizer.
        base_lr="${..lr}",
        weight_decay_norm=0.0,
    ),
    lr=1e-4,
    betas=(0.9, 0.999),
    weight_decay=0.1,
)
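In all three variants, get_default_optimizer_params builds per-parameter groups from the model, and weight_decay_norm=0.0 disables weight decay for normalization-layer parameters. A standalone check with a toy module (for illustration only):

import torch.nn as nn
from detectron2.solver.build import get_default_optimizer_params

toy = nn.Sequential(nn.Conv2d(3, 8, 3), nn.BatchNorm2d(8))
groups = get_default_optimizer_params(toy, base_lr=1e-4, weight_decay_norm=0.0)
for g in groups:
    # the BatchNorm parameters land in a group with weight_decay 0.0
    print(len(g["params"]), g.get("weight_decay"))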