ZZZ429 / CC-DETR


train help #2

Open YZGod666 opened 1 month ago

YZGod666 commented 1 month ago

Traceback (most recent call last):
  File "I:/Code/CC-DETR-main/Networks/ALTGVT1.py", line 596, in <module>
    model = alt_gvt_large(pretrained=True)
  File "I:/Code/CC-DETR-main/Networks/ALTGVT1.py", line 586, in alt_gvt_large
    **kwargs)
  File "I:/Code/CC-DETR-main/Networks/ALTGVT1.py", line 531, in __init__
    norm_layer, depths, sr_ratios, block_cls)
  File "I:/Code/CC-DETR-main/Networks/ALTGVT1.py", line 518, in __init__
    norm_layer, depths, sr_ratios, block_cls)
  File "I:/Code/CC-DETR-main/Networks/ALTGVT1.py", line 432, in __init__
    sr_ratios, block_cls)
  File "I:/Code/CC-DETR-main/Networks/ALTGVT1.py", line 329, in __init__
    for i in range(depths[k])])
  File "I:/Code/CC-DETR-main/Networks/ALTGVT1.py", line 329, in <listcomp>
    for i in range(depths[k])])
  File "I:/Code/CC-DETR-main/Networks/ALTGVT1.py", line 250, in __init__
    drop_path, act_layer, norm_layer)
  File "D:\Annconda3\envs\swin\lib\site-packages\timm\models\vision_transformer.py", line 150, in __init__
    self.ls1 = LayerScale(dim, init_values=init_values) if init_values else nn.Identity()
  File "D:\Annconda3\envs\swin\lib\site-packages\timm\models\vision_transformer.py", line 117, in __init__
    self.gamma = nn.Parameter(init_values * torch.ones(dim))
TypeError: unsupported operand type(s) for *: 'type' and 'Tensor'

Process finished with exit code 1
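For reference, the failing expression can be reproduced outside the repo. This is a minimal sketch, assuming the root cause is a positional-argument shift: the installed timm's Block signature has an init_values parameter, so a norm/activation class passed positionally from ALTGVT1.py line 250 lands in that slot and is then multiplied by a tensor inside LayerScale.

    import torch
    import torch.nn as nn

    # Minimal sketch of the failing expression, assuming a class such as
    # nn.LayerNorm ended up in timm's init_values slot.
    init_values = nn.LayerNorm            # a type, not a float
    try:
        gamma = nn.Parameter(init_values * torch.ones(64))
    except TypeError as err:
        print(err)                        # unsupported operand type(s) for *: 'type' and 'Tensor'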

ZZZ429 commented 1 month ago

Update the timm library version.
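A quick way to confirm the upgrade is to print the installed timm version; as a fallback, calling timm's Block with keyword arguments also avoids the positional shift. This is a sketch only; the exact arguments used in ALTGVT1.py line 250 may differ.

    import timm
    import torch.nn as nn
    from timm.models.vision_transformer import Block

    # Confirm which timm release is actually active in the environment.
    print(timm.__version__)

    # Sketch of constructing timm's Block with keyword arguments so that a
    # norm/activation class can no longer slide into the init_values slot
    # if the positional signature changes between timm releases.
    blk = Block(
        dim=256,
        num_heads=8,
        mlp_ratio=4.0,
        qkv_bias=True,
        drop_path=0.1,
        act_layer=nn.GELU,
        norm_layer=nn.LayerNorm,
    )
    print(blk)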

YZGod666 commented 1 month ago

After updating the version it runs, but there is still another problem:

Traceback (most recent call last):
  File "train.py", line 68, in <module>
    trainer.train()
  File "I:\Code\CC-DETR-main\train_net.py", line 181, in train
    self.train_epoch()
  File "I:\Code\CC-DETR-main\train_net.py", line 205, in train_epoch
    outputs, outputs_normed = self.model(inputs)
  File "D:\Annconda3\envs\swin\lib\site-packages\torch\nn\modules\module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
  File "I:\Code\CC-DETR-main\Networks\ALTGVT1.py", line 502, in forward
    out = self.detr(x[3], None, x[1].flatten(2).permute(2, 0, 1), self.pos_embed)
  File "D:\Annconda3\envs\swin\lib\site-packages\torch\nn\modules\module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
  File "I:\Code\CC-DETR-main\Networks\transformer.py", line 56, in forward
    memory = self.encoder(src, src_key_padding_mask=mask, pos=pos_embed)
  File "D:\Annconda3\envs\swin\lib\site-packages\torch\nn\modules\module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
  File "I:\Code\CC-DETR-main\Networks\transformer.py", line 81, in forward
    src_key_padding_mask=src_key_padding_mask, pos=pos)
  File "D:\Annconda3\envs\swin\lib\site-packages\torch\nn\modules\module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
  File "I:\Code\CC-DETR-main\Networks\transformer.py", line 187, in forward
    return self.forward_post(src, src_mask, src_key_padding_mask, pos)
  File "I:\Code\CC-DETR-main\Networks\transformer.py", line 157, in forward_post
    q = k = self.with_pos_embed(src, pos)
  File "I:\Code\CC-DETR-main\Networks\transformer.py", line 150, in with_pos_embed
    return tensor if pos is None else tensor + pos
RuntimeError: The size of tensor a (64) must match the size of tensor b (144) at non-singleton dimension 0
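This error means the flattened feature map fed to the encoder has 64 tokens (an 8x8 grid) while self.pos_embed holds 144 tokens (a 12x12 grid), so the addition in with_pos_embed cannot broadcast. A common cause is a training input/crop size that differs from the resolution the positional embedding was built for. Below is a hedged sketch of interpolating the embedding to the current grid; resize_pos_embed is a hypothetical helper, not part of CC-DETR, and it assumes the (HW, B, C) layout produced by flatten(2).permute(2, 0, 1).

    import torch
    import torch.nn.functional as F

    def resize_pos_embed(pos_embed, new_h, new_w):
        # pos_embed is assumed to be (N, 1, C), laid out as a square grid.
        n, b, c = pos_embed.shape
        side = int(n ** 0.5)                                   # e.g. 144 -> 12
        grid = pos_embed.permute(1, 2, 0).reshape(b, c, side, side)
        grid = F.interpolate(grid, size=(new_h, new_w),
                             mode="bilinear", align_corners=False)
        return grid.reshape(b, c, -1).permute(2, 0, 1)         # (new_h*new_w, 1, C)

    pos = torch.randn(144, 1, 256)                             # 12x12 grid of 256-dim embeddings
    print(resize_pos_embed(pos, 8, 8).shape)                   # torch.Size([64, 1, 256])

Alternatively, keeping the training input size consistent with the resolution self.pos_embed was created for avoids resizing altogether.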