```python
if scaler is not None:
    with torch.autocast(device_type=str(device), cache_enabled=True):
        outputs = model(samples, targets=targets)

    with torch.autocast(device_type=str(device), enabled=False):
        loss_dict = criterion(outputs, targets, **metas)
```
Is there any reason that the `enabled` flag is always `False` here? Is that a bug?
I assume the AMP logic doesn't take effect even when `use_amp=True`, because the `torch.autocast` `enabled` flag is always `False` for the criterion call.
https://github.com/lyuwenyu/RT-DETR/blob/main/rtdetrv2_pytorch/src/solver/det_engine.py#L48
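For reference, here is a minimal runnable sketch of how nested `torch.autocast` contexts behave (my own illustration, not code from the repo; it uses CPU + bfloat16 so it runs without a GPU). The outer context runs eligible ops in reduced precision, while the inner `enabled=False` context opts back out of autocast, which is what the criterion call above does:

```python
import torch

x = torch.randn(4, 8)
w = torch.randn(8, 8)

# Outer context: autocast is ON, so eligible ops (e.g. matmul) run in bfloat16.
with torch.autocast(device_type="cpu", dtype=torch.bfloat16, cache_enabled=True):
    y = x @ w
    print(y.dtype)  # torch.bfloat16

    # Inner context: autocast is OFF, mirroring the criterion call in the snippet.
    # Casting back to float32 makes the "loss" compute in full precision.
    with torch.autocast(device_type="cpu", enabled=False):
        loss = (y.float() ** 2).mean()
        print(loss.dtype)  # torch.float32
```

So as I understand it, only the inner (loss) region is forced to full precision while the model forward still runs under autocast, but I'd like to confirm whether that is the intended behavior.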