lyuwenyu / RT-DETR

[CVPR 2024] Official RT-DETR (RTDETR paddle pytorch), Real-Time DEtection TRansformer, DETRs Beat YOLOs on Real-time Object Detection. 🔥 🔥 🔥

Is there any reason for torch.autocast(enabled=False) in training? #423

Closed · int11 closed this 2 months ago

int11 commented 2 months ago

I assume the AMP logic doesn't work even when use_amp=True, because the torch.autocast enabled flag is always False.

https://github.com/lyuwenyu/RT-DETR/blob/main/rtdetrv2_pytorch/src/solver/det_engine.py#L48

if scaler is not None:
    # Forward pass under autocast; `enabled` is not passed here, so it defaults to True.
    with torch.autocast(device_type=str(device), cache_enabled=True):
        outputs = model(samples, targets=targets)

    # Loss is computed with autocast explicitly disabled, i.e. in full fp32.
    with torch.autocast(device_type=str(device), enabled=False):
        loss_dict = criterion(outputs, targets, **metas)

Is there any reason the enabled flag is always False here? Is that a bug?
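
For context, running the forward pass under autocast while computing the loss with autocast disabled (so the loss math happens in full fp32) is a common AMP pattern for numerical stability. Below is a minimal sketch of that pattern, assuming a CUDA device; model, criterion, optimizer, and scaler are placeholder names for illustration, not RT-DETR's actual API:

import torch

def train_step_amp(model, criterion, optimizer, scaler, samples, targets, device="cuda"):
    # Hypothetical helper illustrating the usual AMP step; not RT-DETR's code.
    optimizer.zero_grad()

    # Forward pass under autocast: `enabled` defaults to True, so eligible
    # ops (matmuls, convolutions) run in reduced precision (fp16/bf16).
    with torch.autocast(device_type=device):
        outputs = model(samples)

    # Loss with autocast disabled, i.e. computed in full fp32.
    with torch.autocast(device_type=device, enabled=False):
        loss = criterion(outputs, targets)

    # GradScaler scales the loss before backward to avoid fp16 gradient
    # underflow, then unscales internally before the optimizer step.
    scaler.scale(loss).backward()
    scaler.step(optimizer)
    scaler.update()
    return loss

Here scaler would be a torch.cuda.amp.GradScaler(), which matches the `scaler is not None` check in the snippet above.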