tinyvision / DAMO-YOLO

DAMO-YOLO: a fast and accurate object detection method with several new techniques, including NAS backbones, an efficient RepGFPN, ZeroHead, AlignedOTA, and distillation enhancement.
Apache License 2.0

Categories model #105

Open ytzfhqs opened 1 year ago

ytzfhqs commented 1 year ago

Before Asking

Search before asking

Question

I want to use the general detection models described in the documentation (80-categories-DAMO-YOLO-S, 701-categories-DAMO-YOLO-S), but when running demo.py I found that the configs folder contains no corresponding config file. I tried using damoyolo_tinynasL25_S.py as the config for 80-categories-DAMO-YOLO-S, but got the following error:

Inference with torch engine!
2023-05-11 23:17:45.885 | ERROR    | __main__:<module>:367 - An error has been caught in function '<module>', process 'MainProcess' (7828), thread 'MainThread' (13596):
Traceback (most recent call last):

> File "C:\Users\lenovo\Desktop\DAMO-YOLO-master\tools\demo.py", line 367, in <module>
    main()
    └ <function main at 0x000001F5DC15ED30>

  File "C:\Users\lenovo\Desktop\DAMO-YOLO-master\tools\demo.py", line 326, in main
    infer_engine = Infer(config, infer_size=args.infer_size, device=args.device,
                   │     │                  │    │                  │    └ 'cpu'
                   │     │                  │    │                  └ Namespace(input_type='image', config_file='./configs/damoyolo_tinynasL25_S.py', path='./assets/dog.jpg', camid=0, engine='./d...
                   │     │                  │    └ [640, 640]
                   │     │                  └ Namespace(input_type='image', config_file='./configs/damoyolo_tinynasL25_S.py', path='./assets/dog.jpg', camid=0, engine='./d...
                   │     └ ╒═════════╤═══════════════════════════════════════════════════════════════════════════════════════════════════╕
                   │       │ keys    │ v...
                   └ <class '__main__.Infer'>

  File "C:\Users\lenovo\Desktop\DAMO-YOLO-master\tools\demo.py", line 54, in __init__
    self.model = self._build_engine(self.config, self.engine_type)
    │            │    │             │    │       │    └ 'torch'
    │            │    │             │    │       └ <__main__.Infer object at 0x000001F5DC1673D0>
    │            │    │             │    └ ╒═════════╤═══════════════════════════════════════════════════════════════════════════════════════════════════╕
    │            │    │             │      │ keys    │ v...
    │            │    │             └ <__main__.Infer object at 0x000001F5DC1673D0>
    │            │    └ <function Infer._build_engine at 0x000001F5DC15E8B0>
    │            └ <__main__.Infer object at 0x000001F5DC1673D0>
    └ <__main__.Infer object at 0x000001F5DC1673D0>

  File "C:\Users\lenovo\Desktop\DAMO-YOLO-master\tools\demo.py", line 76, in _build_engine
    model.load_state_dict(ckpt['model'], strict=True)
    │     │               └ {'model': OrderedDict([('backbone.block_list.0.conv.conv.weight', tensor([[[[-2.9148e-03,  8.9314e-03, -8.4367e-03],
    │     │                         ...
    │     └ <function Module.load_state_dict at 0x000001F5D81AB310>
    └ Detector(
        (backbone): TinyNAS(
          (block_list): ModuleList(
            (0): Focus(
              (conv): ConvBNAct(
                (conv):...

  File "D:\Python\lib\site-packages\torch\nn\modules\module.py", line 1482, in load_state_dict
    raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(

RuntimeError: Error(s) in loading state_dict for Detector:
    Unexpected key(s) in state_dict: "backbone.block_list.1.block_list.0.residual_proj.conv1.weight", "backbone.block_list.1.block_list.0.residual_proj.bn1.weight", "backbone.block_list.1.block_list.0.residual_proj.bn1.bias", "backbone.block_list.1.block_list.0.residual_proj.bn1.running_mean", "backbone.block_list.1.block_list.0.residual_proj.bn1.running_var", "backbone.block_list.1.block_list.0.residual_proj.bn1.num_batches_tracked", "backbone.block_list.3.block_list.0.residual_proj.conv1.weight", "backbone.block_list.3.block_list.0.residual_proj.bn1.weight", "backbone.block_list.3.block_list.0.residual_proj.bn1.bias", "backbone.block_list.3.block_list.0.residual_proj.bn1.running_mean", "backbone.block_list.3.block_list.0.residual_proj.bn1.running_var", "backbone.block_list.3.block_list.0.residual_proj.bn1.num_batches_tracked", "backbone.block_list.5.block_list.0.residual_proj.conv1.weight", "backbone.block_list.5.block_list.0.residual_proj.bn1.weight", "backbone.block_list.5.block_list.0.residual_proj.bn1.bias", "backbone.block_list.5.block_list.0.residual_proj.bn1.running_mean", "backbone.block_list.5.block_list.0.residual_proj.bn1.running_var", "backbone.block_list.5.block_list.0.residual_proj.bn1.num_batches_tracked". 
    size mismatch for head.gfl_cls.0.weight: copying a param with shape torch.Size([81, 128, 3, 3]) from checkpoint, the shape in current model is torch.Size([80, 128, 3, 3]).
    size mismatch for head.gfl_cls.0.bias: copying a param with shape torch.Size([81]) from checkpoint, the shape in current model is torch.Size([80]).
    size mismatch for head.gfl_cls.1.weight: copying a param with shape torch.Size([81, 256, 3, 3]) from checkpoint, the shape in current model is torch.Size([80, 256, 3, 3]).
    size mismatch for head.gfl_cls.1.bias: copying a param with shape torch.Size([81]) from checkpoint, the shape in current model is torch.Size([80]).
    size mismatch for head.gfl_cls.2.weight: copying a param with shape torch.Size([81, 512, 3, 3]) from checkpoint, the shape in current model is torch.Size([80, 512, 3, 3]).
    size mismatch for head.gfl_cls.2.bias: copying a param with shape torch.Size([81]) from checkpoint, the shape in current model is torch.Size([80]).

I hope this issue can be resolved.
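For reference, the shape mismatch can be confirmed by inspecting the checkpoint directly. A minimal sketch, assuming the downloaded 80-categories checkpoint is saved under a placeholder filename:

```python
import torch

# Placeholder filename for the downloaded 80-categories checkpoint.
ckpt = torch.load("80_categories_damoyolo_s.pth", map_location="cpu")
state_dict = ckpt.get("model", ckpt)  # demo.py reads ckpt['model']

# Per the traceback, the classification head in the checkpoint has 81
# output channels, while damoyolo_tinynasL25_S.py builds an 80-class model.
print(state_dict["head.gfl_cls.0.weight"].shape)  # torch.Size([81, 128, 3, 3])
```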

Additional

No response

KotovNikitaStudent commented 1 year ago

Try to use load_state_dict(torch.load("your_model.pth"), strict=False).
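A minimal sketch of this workaround (the helper name load_checkpoint_loosely is made up for illustration). Note that strict=False only silences missing/unexpected keys; load_state_dict still raises on shape mismatches, so the mismatched head.gfl_cls.* weights (81 vs. 80 classes above) have to be dropped from the state dict first:

```python
import torch
from torch import nn

def load_checkpoint_loosely(model: nn.Module, ckpt_path: str) -> None:
    """Load a checkpoint, skipping keys that are absent or have mismatched shapes."""
    ckpt = torch.load(ckpt_path, map_location="cpu")
    state_dict = ckpt.get("model", ckpt)  # demo.py stores the weights under 'model'

    # strict=False ignores missing/unexpected keys, but load_state_dict
    # still errors on shape mismatches, so filter those keys out first.
    model_state = model.state_dict()
    filtered = {k: v for k, v in state_dict.items()
                if k in model_state and v.shape == model_state[k].shape}

    missing, unexpected = model.load_state_dict(filtered, strict=False)
    print("skipped keys:", sorted(set(state_dict) - set(filtered)))
```

Any skipped head weights remain randomly initialized, so class predictions will only be meaningful if the config's number of classes matches the checkpoint.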