LeapLabTHU / Rank-DETR

[NeurIPS 2023] Rank-DETR for High Quality Object Detection
Apache License 2.0

TypeError: 'NoneType' object is not subscriptable #3

Closed · yhj-1 closed this issue 4 months ago

yhj-1 commented 4 months ago

This issue occurred while I was training rank_detr_r50_50ep: the problem is that class_embed is None. I was able to run rank_detr_r50_two_stage_12ep, because there class_embed is assigned a value, and the other two configs work fine as well. May I ask whether the author forgot the assignment of class_embed in rank_detr_r50_50ep?

Traceback (most recent call last):
  File "tools/train_net.py", line 307, in <module>
    launch(
  File "/root/detrex/detectron2/detectron2/engine/launch.py", line 84, in launch
    main_func(*args)
  File "tools/train_net.py", line 302, in main
    do_train(args, cfg)
  File "tools/train_net.py", line 275, in do_train
    trainer.train(start_iter, cfg.train.max_iter)
  File "/root/detrex/detectron2/detectron2/engine/train_loop.py", line 155, in train
    self.run_step()  # error point
  File "tools/train_net.py", line 101, in run_step
    loss_dict = self.model(data)
  File "/root/miniconda3/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1110, in _call_impl
    return forward_call(*input, **kwargs)
  File "/root/detrex/projects/rank_detr/modeling/rank_detr.py", line 241, in forward
    ) = self.transformer(  # error point
  File "/root/miniconda3/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1110, in _call_impl
    return forward_call(*input, **kwargs)
  File "/root/detrex/projects/rank_detr/modeling/rank_transformer.py", line 557, in forward
    inter_states, inter_references = self.decoder(  # error point
  File "/root/miniconda3/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1110, in _call_impl
    return forward_call(*input, **kwargs)
  File "/root/detrex/projects/rank_detr/modeling/rank_transformer.py", line 279, in forward
    outputs_class_tmp = self.class_embed[layer_idx](output)  # [bs, num_queries, embed_dim] -> [bs, num_queries, num_classes]
TypeError: 'NoneType' object is not subscriptable
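The final frame is indexing an attribute that is still None, which is exactly what this TypeError means. A minimal standalone sketch (illustrative names, not the project code) reproducing the same error:

```python
# Minimal repro of the failure mode: indexing an attribute left as None.
class Decoder:
    def __init__(self):
        self.class_embed = None  # meant to be replaced with per-layer Linear heads

decoder = Decoder()
layer_idx = 0
decoder.class_embed[layer_idx]  # TypeError: 'NoneType' object is not subscriptable
```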

yifanpu001 commented 4 months ago

@yhj-1 please refer to https://github.com/LeapLabTHU/Rank-DETR/issues/1#issuecomment-1770328735

yhj-1 commented 4 months ago

> Regarding problem 2, I haven't encountered the same situation. Although line 167 of rank_transformer.py has "self.class_embed = None" in the __init__ method of RankDetrTransformerDecoder, it will be overwritten by a Linear head at line 148 of rank_detr.py. Could you please provide more details to reproduce this error?

This explanation is a bit problematic: although "self.class_embed" is indeed overridden by a Linear head at line 148 of rank_detr.py, that only happens when as_two_stage=True. I am training with the "rank_detr_r50_50ep.py" config, which does not use as_two_stage, so the problem remains unresolved.
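To make the conditional concrete, here is a minimal sketch of the wiring being described, assuming a deformable-DETR-style detector; class and argument names follow the thread, but the body is illustrative rather than verbatim project code:

```python
import torch.nn as nn

class RankDETR(nn.Module):
    """Illustrative skeleton, not the actual projects/rank_detr/modeling/rank_detr.py."""

    def __init__(self, transformer, embed_dim, num_classes, num_decoder_layers,
                 as_two_stage=False):
        super().__init__()
        self.transformer = transformer
        # one classification head per decoder layer
        self.class_embed = nn.ModuleList(
            [nn.Linear(embed_dim, num_classes) for _ in range(num_decoder_layers)]
        )
        if as_two_stage:
            # Only on this branch is the decoder's class_embed (initialized to
            # None in RankDetrTransformerDecoder.__init__) overwritten.
            self.transformer.decoder.class_embed = self.class_embed
        # With as_two_stage=False the decoder attribute stays None, so
        # self.class_embed[layer_idx] inside the decoder raises the TypeError above.
```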

yifanpu001 commented 4 months ago

Hi @yhj-1, the simple answer is that you should not use "rank_detr_r50_50ep.py" directly: it is a base config file for the others to inherit from. To reproduce the results in the paper, use the other config files in projects/rank_detr/configs/.
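For instance, a runnable two-stage config would inherit the base file and flip the relevant switch. A sketch in detrex's LazyConfig style, assuming the usual layout of projects/rank_detr/configs/ (the override lines and values are assumptions, not the repo's verbatim contents):

```python
# sketch of e.g. rank_detr_r50_two_stage_12ep.py (assumed structure)
from .rank_detr_r50_50ep import train, dataloader, optimizer, lr_multiplier, model

# enable the two-stage pipeline so the decoder's class_embed gets assigned
model.as_two_stage = True

# shorten the schedule from 50 to 12 epochs (illustrative value)
train.max_iter = 90000
```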

yhj-1 commented 4 months ago

I understand what you're saying, got it. Thank you!