MIV-XJTU / ARTrack

Apache License 2.0

About testing #45

Closed 123wfl closed 6 months ago

123wfl commented 6 months ago

I get an error when testing on a single GPU:

Tracker: artrack_seq artrack_seq_256_full None , Sequence: 213 test config: {'MODEL': {'PRETRAIN_FILE': 'mae_pretrain_vit_base.pth', 'PRETRAIN_PTH': '', 'PRENUM': 7, 'EXTRA_MERGER': False, 'RETURN_INTER': False, 'RETURN_STAGES': [2, 5, 8, 11], 'BACKBONE': {'TYPE': 'vit_base_patch16_224', 'STRIDE': 16, 'MID_PE': False, 'SEP_SEG': False, 'CAT_MODE': 'direct', 'MERGE_LAYER': 0, 'ADD_CLS_TOKEN': False, 'CLS_TOKEN_USE_MODE': 'ignore', 'CE_LOC': [], 'CE_KEEP_RATIO': [], 'CE_TEMPLATE_RANGE': 'ALL'}, 'BINS': 400, 'RANGE': 2, 'ENCODER_LAYER': 3, 'NUM_HEADS': 12, 'MLP_RATIO': 4, 'QKV_BIAS': True, 'DROP_RATE': 0.1, 'ATTN_DROP': 0.0, 'DROP_PATH': 0.0, 'DECODER_LAYER': 6, 'HEAD': {'TYPE': 'PIX', 'NUM_CHANNELS': 768}}, 'TRAIN': {'LR': 4e-06, 'WEIGHT_DECAY': 0.05, 'EPOCH': 60, 'LR_DROP_EPOCH': 999, 'BATCH_SIZE': 8, 'NUM_WORKER': 4, 'OPTIMIZER': 'ADAMW', 'BACKBONE_MULTIPLIER': 0.1, 'GIOU_WEIGHT': 2.0, 'L1_WEIGHT': 0.0, 'FREEZE_LAYERS': [0], 'PRINT_INTERVAL': 1, 'VAL_EPOCH_INTERVAL': 10, 'GRAD_CLIP_NORM': 0.1, 'AMP': False, 'CE_START_EPOCH': 20, 'CE_WARM_EPOCH': 80, 'DROP_PATH_RATE': 0.1, 'SCHEDULER': {'TYPE': 'step', 'DECAY_RATE': 0.1}}, 'DATA': {'SAMPLER_MODE': 'causal', 'MEAN': [0.485, 0.456, 0.406], 'STD': [0.229, 0.224, 0.225], 'MAX_SAMPLE_INTERVAL': 200, 'MAX_GAP': 300, 'MAX_INTERVAL': 5, 'INTERVAL_PROB': 0.0, 'TEMP': 2, 'TRAIN': {'DATASETS_NAME': ['LASOT', 'GOT10K_vottrain', 'TRACKINGNET'], 'DATASETS_RATIO': [1, 1, 1], 'SAMPLE_PER_EPOCH': 1000}, 'VAL': {'DATASETS_NAME': ['GOT10K_official_val'], 'DATASETS_RATIO': [1], 'SAMPLE_PER_EPOCH': 10000}, 'SEARCH': {'SIZE': 64, 'FACTOR': 4.0, 'CENTER_JITTER': 3, 'SCALE_JITTER': 0.25, 'NUMBER': 36}, 'TEMPLATE': {'NUMBER': 1, 'SIZE': 128, 'FACTOR': 2.0, 'CENTER_JITTER': 0, 'SCALE_JITTER': 0}}, 'TEST': {'TEMPLATE_FACTOR': 2.0, 'TEMPLATE_SIZE': 128, 'SEARCH_FACTOR': 4.0, 'SEARCH_SIZE': 256, 'EPOCH': 60}} 4 4 4 4 4 4 [Errno 2] No such file or directory: ''

ARTrackV2 commented 6 months ago

Could you share the command you used for inference? Then I can help you resolve the problem.

123wfl commented 6 months ago

python tracking/test.py artrack_seq artrack_seq_256_full --dataset lasot --threads 0 --num_gpus 1

When I run this command, I get the error above.

ARTrackV2 commented 6 months ago

For LaSOT, do the first few sequences also fail? If not, could it be that the data is missing?

123wfl commented 6 months ago

They all fail.

ARTrackV2 commented 6 months ago

Looking at your config, it seems the pretrained .pth path in your yaml file is empty. Could you fill it in and try again?

123wfl commented 6 months ago

Tracker: artrack_seq artrack_seq_256_full None , Sequence: 238 test config: {'MODEL': {'PRETRAIN_FILE': 'mae_pretrain_vit_base.pth', 'PRETRAIN_PTH': '/home/lsw/wfl/ARTrack-main/pretrained_models/mae_pretrain_vit_base.pth', 'PRENUM': 7, 'EXTRA_MERGER': False, 'RETURN_INTER': False, 'RETURN_STAGES': [2, 5, 8, 11], 'BACKBONE': {'TYPE': 'vit_base_patch16_224', 'STRIDE': 16, 'MID_PE': False, 'SEP_SEG': False, 'CAT_MODE': 'direct', 'MERGE_LAYER': 0, 'ADD_CLS_TOKEN': False, 'CLS_TOKEN_USE_MODE': 'ignore', 'CE_LOC': [], 'CE_KEEP_RATIO': [], 'CE_TEMPLATE_RANGE': 'ALL'}, 'BINS': 400, 'RANGE': 2, 'ENCODER_LAYER': 3, 'NUM_HEADS': 12, 'MLP_RATIO': 4, 'QKV_BIAS': True, 'DROP_RATE': 0.1, 'ATTN_DROP': 0.0, 'DROP_PATH': 0.0, 'DECODER_LAYER': 6, 'HEAD': {'TYPE': 'PIX', 'NUM_CHANNELS': 768}}, 'TRAIN': {'LR': 4e-06, 'WEIGHT_DECAY': 0.05, 'EPOCH': 60, 'LR_DROP_EPOCH': 999, 'BATCH_SIZE': 8, 'NUM_WORKER': 4, 'OPTIMIZER': 'ADAMW', 'BACKBONE_MULTIPLIER': 0.1, 'GIOU_WEIGHT': 2.0, 'L1_WEIGHT': 0.0, 'FREEZE_LAYERS': [0], 'PRINT_INTERVAL': 1, 'VAL_EPOCH_INTERVAL': 10, 'GRAD_CLIP_NORM': 0.1, 'AMP': False, 'CE_START_EPOCH': 20, 'CE_WARM_EPOCH': 80, 'DROP_PATH_RATE': 0.1, 'SCHEDULER': {'TYPE': 'step', 'DECAY_RATE': 0.1}}, 'DATA': {'SAMPLER_MODE': 'causal', 'MEAN': [0.485, 0.456, 0.406], 'STD': [0.229, 0.224, 0.225], 'MAX_SAMPLE_INTERVAL': 200, 'MAX_GAP': 300, 'MAX_INTERVAL': 5, 'INTERVAL_PROB': 0.0, 'TEMP': 2, 'TRAIN': {'DATASETS_NAME': ['LASOT', 'GOT10K_vottrain', 'TRACKINGNET'], 'DATASETS_RATIO': [1, 1, 1], 'SAMPLE_PER_EPOCH': 1000}, 'VAL': {'DATASETS_NAME': ['GOT10K_official_val'], 'DATASETS_RATIO': [1], 'SAMPLE_PER_EPOCH': 10000}, 'SEARCH': {'SIZE': 64, 'FACTOR': 4.0, 'CENTER_JITTER': 3, 'SCALE_JITTER': 0.25, 'NUMBER': 36}, 'TEMPLATE': {'NUMBER': 1, 'SIZE': 128, 'FACTOR': 2.0, 'CENTER_JITTER': 0, 'SCALE_JITTER': 0}}, 'TEST': {'TEMPLATE_FACTOR': 2.0, 'TEMPLATE_SIZE': 128, 'SEARCH_FACTOR': 4.0, 'SEARCH_SIZE': 256, 'EPOCH': 60}} 4 4 4 'net' Tracker: artrack_seq artrack_seq_256_full None , 
Sequence: 40 test config: {'MODEL': {'PRETRAIN_FILE': 'mae_pretrain_vit_base.pth', 'PRETRAIN_PTH': '/home/lsw/wfl/ARTrack-main/pretrained_models/mae_pretrain_vit_base.pth', 'PRENUM': 7, 'EXTRA_MERGER': False, 'RETURN_INTER': False, 'RETURN_STAGES': [2, 5, 8, 11], 'BACKBONE': {'TYPE': 'vit_base_patch16_224', 'STRIDE': 16, 'MID_PE': False, 'SEP_SEG': False, 'CAT_MODE': 'direct', 'MERGE_LAYER': 0, 'ADD_CLS_TOKEN': False, 'CLS_TOKEN_USE_MODE': 'ignore', 'CE_LOC': [], 'CE_KEEP_RATIO': [], 'CE_TEMPLATE_RANGE': 'ALL'}, 'BINS': 400, 'RANGE': 2, 'ENCODER_LAYER': 3, 'NUM_HEADS': 12, 'MLP_RATIO': 4, 'QKV_BIAS': True, 'DROP_RATE': 0.1, 'ATTN_DROP': 0.0, 'DROP_PATH': 0.0, 'DECODER_LAYER': 6, 'HEAD': {'TYPE': 'PIX', 'NUM_CHANNELS': 768}}, 'TRAIN': {'LR': 4e-06, 'WEIGHT_DECAY': 0.05, 'EPOCH': 60, 'LR_DROP_EPOCH': 999, 'BATCH_SIZE': 8, 'NUM_WORKER': 4, 'OPTIMIZER': 'ADAMW', 'BACKBONE_MULTIPLIER': 0.1, 'GIOU_WEIGHT': 2.0, 'L1_WEIGHT': 0.0, 'FREEZE_LAYERS': [0], 'PRINT_INTERVAL': 1, 'VAL_EPOCH_INTERVAL': 10, 'GRAD_CLIP_NORM': 0.1, 'AMP': False, 'CE_START_EPOCH': 20, 'CE_WARM_EPOCH': 80, 'DROP_PATH_RATE': 0.1, 'SCHEDULER': {'TYPE': 'step', 'DECAY_RATE': 0.1}}, 'DATA': {'SAMPLER_MODE': 'causal', 'MEAN': [0.485, 0.456, 0.406], 'STD': [0.229, 0.224, 0.225], 'MAX_SAMPLE_INTERVAL': 200, 'MAX_GAP': 300, 'MAX_INTERVAL': 5, 'INTERVAL_PROB': 0.0, 'TEMP': 2, 'TRAIN': {'DATASETS_NAME': ['LASOT', 'GOT10K_vottrain', 'TRACKINGNET'], 'DATASETS_RATIO': [1, 1, 1], 'SAMPLE_PER_EPOCH': 1000}, 'VAL': {'DATASETS_NAME': ['GOT10K_official_val'], 'DATASETS_RATIO': [1], 'SAMPLE_PER_EPOCH': 10000}, 'SEARCH': {'SIZE': 64, 'FACTOR': 4.0, 'CENTER_JITTER': 3, 'SCALE_JITTER': 0.25, 'NUMBER': 36}, 'TEMPLATE': {'NUMBER': 1, 'SIZE': 128, 'FACTOR': 2.0, 'CENTER_JITTER': 0, 'SCALE_JITTER': 0}}, 'TEST': {'TEMPLATE_FACTOR': 2.0, 'TEMPLATE_SIZE': 128, 'SEARCH_FACTOR': 4.0, 'SEARCH_SIZE': 256, 'EPOCH': 60}} 4 4 4 'net'

ARTrackV2 commented 6 months ago

PRETRAIN_FILE should point to the trained model, not the MAE checkpoint.
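For reference, a minimal sketch of the relevant yaml fields. The key names are taken from the config dump above; the file path is an illustrative placeholder, not from this thread:

```yaml
# Hypothetical fragment of the artrack_seq_256_full yaml config.
MODEL:
  PRETRAIN_FILE: "mae_pretrain_vit_base.pth"          # MAE backbone weights (used when training)
  PRETRAIN_PTH: "/path/to/ARTrackSeq_ep0060.pth.tar"  # trained ARTrack-Seq checkpoint for inference
```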

123wfl commented 6 months ago

Tracker: artrack_seq artrack_seq_256_full None , Sequence: 213 test config: {'MODEL': {'PRETRAIN_FILE': 'mae_pretrain_vit_base.pth', 'PRETRAIN_PTH': '/home/lsw/wfl/2/ARTrack/ARTrackSeq_ep0060.pth.tar', 'PRENUM': 7, 'EXTRA_MERGER': False, 'RETURN_INTER': False, 'RETURN_STAGES': [2, 5, 8, 11], 'BACKBONE': {'TYPE': 'vit_base_patch16_224', 'STRIDE': 16, 'MID_PE': False, 'SEP_SEG': False, 'CAT_MODE': 'direct', 'MERGE_LAYER': 0, 'ADD_CLS_TOKEN': False, 'CLS_TOKEN_USE_MODE': 'ignore', 'CE_LOC': [], 'CE_KEEP_RATIO': [], 'CE_TEMPLATE_RANGE': 'ALL'}, 'BINS': 400, 'RANGE': 2, 'ENCODER_LAYER': 3, 'NUM_HEADS': 12, 'MLP_RATIO': 4, 'QKV_BIAS': True, 'DROP_RATE': 0.1, 'ATTN_DROP': 0.0, 'DROP_PATH': 0.0, 'DECODER_LAYER': 6, 'HEAD': {'TYPE': 'PIX', 'NUM_CHANNELS': 768}}, 'TRAIN': {'LR': 4e-06, 'WEIGHT_DECAY': 0.05, 'EPOCH': 60, 'LR_DROP_EPOCH': 999, 'BATCH_SIZE': 8, 'NUM_WORKER': 4, 'OPTIMIZER': 'ADAMW', 'BACKBONE_MULTIPLIER': 0.1, 'GIOU_WEIGHT': 2.0, 'L1_WEIGHT': 0.0, 'FREEZE_LAYERS': [0], 'PRINT_INTERVAL': 1, 'VAL_EPOCH_INTERVAL': 10, 'GRAD_CLIP_NORM': 0.1, 'AMP': False, 'CE_START_EPOCH': 20, 'CE_WARM_EPOCH': 80, 'DROP_PATH_RATE': 0.1, 'SCHEDULER': {'TYPE': 'step', 'DECAY_RATE': 0.1}}, 'DATA': {'SAMPLER_MODE': 'causal', 'MEAN': [0.485, 0.456, 0.406], 'STD': [0.229, 0.224, 0.225], 'MAX_SAMPLE_INTERVAL': 200, 'MAX_GAP': 300, 'MAX_INTERVAL': 5, 'INTERVAL_PROB': 0.0, 'TEMP': 2, 'TRAIN': {'DATASETS_NAME': ['LASOT', 'GOT10K_vottrain', 'TRACKINGNET'], 'DATASETS_RATIO': [1, 1, 1], 'SAMPLE_PER_EPOCH': 1000}, 'VAL': {'DATASETS_NAME': ['GOT10K_official_val'], 'DATASETS_RATIO': [1], 'SAMPLE_PER_EPOCH': 10000}, 'SEARCH': {'SIZE': 64, 'FACTOR': 4.0, 'CENTER_JITTER': 3, 'SCALE_JITTER': 0.25, 'NUMBER': 36}, 'TEMPLATE': {'NUMBER': 1, 'SIZE': 128, 'FACTOR': 2.0, 'CENTER_JITTER': 0, 'SCALE_JITTER': 0}}, 'TEST': {'TEMPLATE_FACTOR': 2.0, 'TEMPLATE_SIZE': 128, 'SEARCH_FACTOR': 4.0, 'SEARCH_SIZE': 256, 'EPOCH': 60}} 4 4 4 Error(s) in loading state_dict for ARTrackSeq: size mismatch for 
backbone.pos_embed_x: copying a param with shape torch.Size([1, 256, 768]) from checkpoint, the shape in current model is torch.Size([1, 16, 768]). size mismatch for pix_head.encoder.x_pos_enc.pos.w_pos: copying a param with shape torch.Size([16, 768]) from checkpoint, the shape in current model is torch.Size([4, 768]). size mismatch for pix_head.encoder.x_pos_enc.pos.h_pos: copying a param with shape torch.Size([16, 768]) from checkpoint, the shape in current model is torch.Size([4, 768]). size mismatch for pix_head.encoder.x_rel_pos_bias_table.relative_position_bias_table: copying a param with shape torch.Size([12, 961]) from checkpoint, the shape in current model is torch.Size([12, 49]). size mismatch for pix_head.encoder.z_x_rel_pos_bias_table.relative_position_bias_table: copying a param with shape torch.Size([12, 529]) from checkpoint, the shape in current model is torch.Size([12, 121]). size mismatch for pix_head.encoder.x_z_rel_pos_bias_table.relative_position_bias_table: copying a param with shape torch.Size([12, 529]) from checkpoint, the shape in current model is torch.Size([12, 121]).
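The shapes in this error are consistent with a search/template size mismatch between the checkpoint and the current config (the dump above shows DATA.SEARCH.SIZE: 64, while the checkpoint appears to have been trained for a 256-pixel search region). A small sketch of the arithmetic, assuming the ViT stride of 16 from the config; the helper names are made up for illustration:

```python
# A 16-stride ViT turns an N x N crop into (N // 16) ** 2 patch tokens.
STRIDE = 16

def tokens_per_side(size):
    return size // STRIDE

def num_tokens(size):
    return tokens_per_side(size) ** 2

def rel_pos_entries(side):
    # relative-position bias table within one region: (2*side - 1)^2 offsets
    return (2 * side - 1) ** 2

def cross_rel_pos_entries(side_a, side_b):
    # template <-> search relative-position table: (side_a + side_b - 1)^2 offsets
    return (side_a + side_b - 1) ** 2

# Checkpoint side (SEARCH_SIZE 256, TEMPLATE_SIZE 128):
print(num_tokens(256))                        # 256 -> pos_embed_x [1, 256, 768]
print(rel_pos_entries(tokens_per_side(256)))  # 961 -> table [12, 961]
print(cross_rel_pos_entries(tokens_per_side(256), tokens_per_side(128)))  # 529

# Current model side (DATA.SEARCH.SIZE 64, TEMPLATE SIZE 128):
print(num_tokens(64))                         # 16 -> pos_embed_x [1, 16, 768]
print(rel_pos_entries(tokens_per_side(64)))   # 49 -> table [12, 49]
print(cross_rel_pos_entries(tokens_per_side(64), tokens_per_side(128)))   # 121
```

Every mismatched dimension in the traceback matches this pattern, which suggests checking that the yaml's sizes agree with the checkpoint rather than a corrupt file.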

123wfl commented 6 months ago

It works now, thank you.