CaptainEven / MCMOT

Real time one-stage multi-class & multi-object tracking based on anchor-free detection and ReID
MIT License

a problem #14

Open wdxpython opened 3 years ago

wdxpython commented 3 years ago

I used your code for training, but nothing is detected during testing. I only have two classes, and opt has been modified accordingly.

```
0 1 0.7802083333333333 0.5222222222222223 0.030208333333333334 0.16666666666666666
1 1 0.5213541666666667 0.4935185185185185 0.09895833333333333 0.13518518518518519
1 2 0.4075520833333333 0.46435185185185185 0.07760416666666667 0.12314814814814815
1 3 0.32265625 0.4634259259259259 0.0734375 0.09351851851851851
1 4 0.53359375 0.39166666666666666 0.0328125 0.05
1 5 0.2513020833333333 0.38657407407407407 0.07760416666666667 0.1824074074074074
1 6 0.4549479166666667 0.3837962962962963 0.09322916666666667 0.16944444444444445
```

This is my label file.
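For reference, these rows follow the FairMOT/JDE label layout that MCMOT expects: `class_id track_id x_center y_center width height`, with the box coordinates normalized by image width and height. A small decoding sketch (the 1920×1080 frame size is an assumption for illustration, not taken from the issue):

```python
# Sketch: decode one FairMOT/JDE-style label row into pixel coordinates.
# Assumed row layout: class_id track_id x_center y_center width height (normalized).
def parse_label_line(line, img_w, img_h):
    """Parse one label row; return class id, track id, and (x1, y1, w, h) in pixels."""
    cls_id, track_id, xc, yc, w, h = line.split()
    cls_id, track_id = int(cls_id), int(track_id)
    xc, yc, w, h = (float(v) for v in (xc, yc, w, h))
    x1 = (xc - w / 2.0) * img_w  # top-left corner from the normalized center
    y1 = (yc - h / 2.0) * img_h
    return cls_id, track_id, (x1, y1, w * img_w, h * img_h)

# Second row from the label file above, assuming a 1920x1080 frame:
row = "1 1 0.5213541666666667 0.4935185185185185 0.09895833333333333 0.13518518518518519"
cls_id, track_id, box = parse_label_line(row, img_w=1920, img_h=1080)
```

With a 1920×1080 frame this row decodes to a 190×146 px box for class 1, track 1, which is at least a sanity check that the values are plausible normalized coordinates.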

CaptainEven commented 3 years ago

May I see the details of your training log?

wdxpython commented 3 years ago

```
tensor([[0.06874, 0.05902, 0.05875, 0.05856, 0.05795, 0.05795, 0.05794, 0.05792, 0.05786, 0.05769, 0.05715, 0.05443, 0.05419, 0.05133, 0.04957, 0.04819, 0.04808, 0.04808, 0.04808, 0.04808, 0.04807, 0.04807, 0.04770, 0.04751, 0.04654, 0.04653, 0.04652, 0.04651, 0.04651, 0.04651, 0.04650, 0.04626, 0.04599, 0.04492, 0.04468, 0.04464, 0.04347, 0.04252, 0.04222, 0.04216, 0.04216, 0.04216, 0.04215, 0.04215, 0.04213, 0.04210, 0.04203, 0.04195, 0.04195, 0.04195, 0.04194, 0.04193, 0.04192, 0.04157, 0.04101, 0.04070, 0.04062, 0.04055, 0.04055, 0.04055, 0.04055, 0.04054, 0.04053, 0.03970, 0.03958, 0.03896, 0.03888, 0.03875, 0.03847, 0.03847, 0.03847, 0.03845, 0.03844, 0.03839, 0.03838, 0.03837, 0.03831, 0.03824, 0.03816, 0.03809, 0.03809, 0.03809, 0.03809, 0.03807, 0.03805, 0.03805, 0.03800, 0.03797, 0.03797, 0.03797, 0.03794, 0.03787, 0.03781, 0.03781, 0.03781, 0.03781, 0.03780, 0.03780, 0.03780, 0.03779, 0.03779, 0.03778, 0.03778, 0.03778, 0.03778, 0.03778, 0.03777, 0.03773, 0.03772, 0.03772, 0.03771, 0.03771, 0.03771, 0.03770, 0.03770, 0.03770, 0.03770, 0.03770, 0.03770, 0.03769, 0.03769, 0.03768, 0.03768, 0.03763, 0.03761, 0.03761, 0.03760, 0.03759]])
```

The top-K scores are too small. Could you please give me some help?
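One way to see why nothing survives: `mot_decode` keeps the K = 128 highest heatmap scores, and the detections are then filtered against `conf_thres` (0.4 in the opt dump below). This is an illustrative sketch, not the repo's exact code; `top_k_scores` reuses a few values from the tensor above:

```python
import numpy as np

# Every top-K heatmap score printed above is below ~0.07, so filtering with
# `score > conf_thres` (0.4) removes all of them -> zero detections.
top_k_scores = np.array([0.06874, 0.05902, 0.05875, 0.04808, 0.03759])  # sample values
conf_thres = 0.4
kept = top_k_scores[top_k_scores > conf_thres]
print(len(kept))  # 0 -> no detections survive the threshold
```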

wdxpython commented 3 years ago

```
2020-07-25-15-23: epoch: 1 |loss 16.706390 | hm_loss 4.010434 | wh_loss 6.783704 | off_loss 0.331461 | id_loss 1.538475 | time 0.133333 |
2020-07-25-15-23: epoch: 2 |loss 11.525661 | hm_loss 2.593460 | wh_loss 6.082298 | off_loss 0.260828 | id_loss 1.385574 | time 0.116667 |
2020-07-25-15-23: epoch: 3 |loss 9.770831 | hm_loss 2.148389 | wh_loss 5.512734 | off_loss 0.255916 | id_loss 1.289775 | time 0.116667 |
2020-07-25-15-23: epoch: 4 |loss 9.080206 | hm_loss 2.004384 | wh_loss 5.197354 | off_loss 0.255996 | id_loss 1.200876 | time 0.116667 |
2020-07-25-15-23: epoch: 5 |loss 8.608916 | hm_loss 1.927440 | wh_loss 4.691648 | off_loss 0.253841 | id_loss 1.163306 | time 0.116667 |
2020-07-25-15-23: epoch: 6 |loss 8.049070 | hm_loss 1.805221 | wh_loss 4.462487 | off_loss 0.252132 | id_loss 1.101103 | time 0.116667 |
2020-07-25-15-23: epoch: 7 |loss 7.629802 | hm_loss 1.735423 | wh_loss 4.084459 | off_loss 0.252223 | id_loss 1.049643 | time 0.116667 |
2020-07-25-15-24: epoch: 8 |loss 7.294641 | hm_loss 1.683209 | wh_loss 3.784338 | off_loss 0.255476 | id_loss 0.993329 | time 0.116667 |
2020-07-25-15-24: epoch: 9 |loss 6.572513 | hm_loss 1.539876 | wh_loss 3.140989 | off_loss 0.249983 | id_loss 0.962837 | time 0.116667 |
2020-07-25-15-24: epoch: 10 |loss 5.748363 | hm_loss 1.395520 | wh_loss 2.292103 | off_loss 0.237810 | id_loss 0.922537 | time 0.116667 |
2020-07-25-15-24: epoch: 11 |loss 5.359005 | hm_loss 1.316424 | wh_loss 1.934236 | off_loss 0.240189 | id_loss 0.900977 | time 0.116667 |
2020-07-25-15-24: epoch: 12 |loss 5.058914 | hm_loss 1.227322 | wh_loss 2.128400 | off_loss 0.233097 | id_loss 0.862485 | time 0.116667 |
2020-07-25-15-24: epoch: 13 |loss 4.729127 | hm_loss 1.168086 | wh_loss 1.843425 | off_loss 0.231113 | id_loss 0.831456 | time 0.116667 |
2020-07-25-15-24: epoch: 14 |loss 4.352957 | hm_loss 1.078359 | wh_loss 1.749127 | off_loss 0.224725 | id_loss 0.802623 | time 0.116667 |
2020-07-25-15-24: epoch: 15 |loss 4.220733 | hm_loss 1.037460 | wh_loss 1.707087 | off_loss 0.229688 | id_loss 0.800281 | time 0.116667 |
2020-07-25-15-25: epoch: 16 |loss 4.244493 | hm_loss 1.069559 | wh_loss 1.628439 | off_loss 0.223213 | id_loss 0.779154 | time 0.116667 |
2020-07-25-15-25: epoch: 17 |loss 3.866145 | hm_loss 0.963731 | wh_loss 1.583025 | off_loss 0.224321 | id_loss 0.756381 | time 0.116667 |
2020-07-25-15-25: epoch: 18 |loss 3.587856 | hm_loss 0.893426 | wh_loss 1.544317 | off_loss 0.213560 | id_loss 0.749986 | time 0.116667 |
2020-07-25-15-25: epoch: 19 |loss 3.594167 | hm_loss 0.893133 | wh_loss 1.532511 | off_loss 0.215036 | id_loss 0.755791 | time 0.150000 |
2020-07-25-15-25: epoch: 20 |loss 3.298381 | hm_loss 0.813497 | wh_loss 1.491555 | off_loss 0.210850 | id_loss 0.743285 | time 0.116667 |
2020-07-25-15-25: epoch: 21 |loss 3.112709 | hm_loss 0.774188 | wh_loss 1.294281 | off_loss 0.204225 | id_loss 0.758639 | time 0.116667 |
2020-07-25-15-25: epoch: 22 |loss 2.758806 | hm_loss 0.690057 | wh_loss 1.213770 | off_loss 0.189154 | id_loss 0.746714 | time 0.116667 |
2020-07-25-15-26: epoch: 23 |loss 2.671970 | hm_loss 0.661786 | wh_loss 1.202637 | off_loss 0.189784 | id_loss 0.749285 | time 0.116667 |
2020-07-25-15-26: epoch: 24 |loss 2.561024 | hm_loss 0.636040 | wh_loss 1.136937 | off_loss 0.185414 | id_loss 0.752422 | time 0.116667 |
2020-07-25-15-26: epoch: 25 |loss 2.578154 | hm_loss 0.643156 | wh_loss 1.184915 | off_loss 0.184276 | id_loss 0.740647 | time 0.116667 |
2020-07-25-15-26: epoch: 26 |loss 2.574612 | hm_loss 0.644176 | wh_loss 1.129168 | off_loss 0.183672 | id_loss 0.749717 | time 0.116667 |
2020-07-25-15-26: epoch: 27 |loss 2.411899 | hm_loss 0.608395 | wh_loss 1.069528 | off_loss 0.179419 | id_loss 0.736862 | time 0.116667 |
2020-07-25-15-26: epoch: 28 |loss 2.452388 | hm_loss 0.613565 | wh_loss 1.076507 | off_loss 0.182253 | id_loss 0.746249 | time 0.116667 |
2020-07-25-15-26: epoch: 29 |loss 2.438402 | hm_loss 0.604570 | wh_loss 1.120330 | off_loss 0.178138 | id_loss 0.755783 | time 0.116667 |
2020-07-25-15-27: epoch: 30 |loss 2.379139 | hm_loss 0.589563 | wh_loss 1.073109 | off_loss 0.181859 | id_loss 0.749417 | time 0.116667 |
2020-07-25-15-27: epoch: 31 |loss 2.370215 | hm_loss 0.584263 | wh_loss 1.074369 | off_loss 0.181206 | id_loss 0.756079 | time 0.116667 |
2020-07-25-15-27: epoch: 32 |loss 2.394312 | hm_loss 0.588443 | wh_loss 1.087430 | off_loss 0.184744 | id_loss 0.753074 | time 0.116667 |
2020-07-25-15-27: epoch: 33 |loss 2.354809 | hm_loss 0.577970 | wh_loss 1.078386 | off_loss 0.175335 | id_loss 0.771379 | time 0.116667 |
2020-07-25-15-27: epoch: 34 |loss 2.409918 | hm_loss 0.599868 | wh_loss 1.071716 | off_loss 0.181526 | id_loss 0.749375 | time 0.116667 |
2020-07-25-15-27: epoch: 35 |loss 2.378381 | hm_loss 0.592600 | wh_loss 1.099319 | off_loss 0.175975 | id_loss 0.749422 | time 0.116667 |
2020-07-25-15-27: epoch: 36 |loss 2.379342 | hm_loss 0.590402 | wh_loss 1.103652 | off_loss 0.180813 | id_loss 0.743266 | time 0.116667 |
2020-07-25-15-27: epoch: 37 |loss 2.331605 | hm_loss 0.583960 | wh_loss 1.071926 | off_loss 0.173788 | id_loss 0.746487 | time 0.116667 |
2020-07-25-15-28: epoch: 38 |loss 2.338724 | hm_loss 0.575330 | wh_loss 1.082806 | off_loss 0.178210 | id_loss 0.758501 | time 0.116667 |
2020-07-25-15-28: epoch: 39 |loss 2.319589 | hm_loss 0.570639 | wh_loss 1.056719 | off_loss 0.180713 | id_loss 0.755622 | time 0.116667 |
2020-07-25-15-28: epoch: 40 |loss 2.350834 | hm_loss 0.576301 | wh_loss 1.077339 | off_loss 0.180297 | id_loss 0.761514 | time 0.116667 |
```

wdxpython commented 3 years ago

```
==> torch version: 1.2.0
==> cudnn version: 7602
==> Cmd: ['train.py', '--exp_id', 'all_hrnet', '--gpus', '0', '--batch_size', '32', '--reid_dim', '128', '--arch', 'hrnet_18']
==> Opt:
K: 128
arch: hrnet_18
batch_size: 32
cat_spec_wh: False
chunk_sizes: [32]
conf_thres: 0.4
data_cfg: ../src/lib/cfg/road.json
data_dir: /mnt/diskb/even/dataset
dataset: jde
debug_dir: /media/wdx/4AB67FF0B67FDB41/MCMOT/src/lib/../../exp/mot/all_hrnet/debug
dense_wh: False
det_thres: 0.3
down_ratio: 4
exp_dir: /media/wdx/4AB67FF0B67FDB41/MCMOT/src/lib/../../exp/mot
exp_id: all_hrnet
fix_res: True
gpus: [0]
gpus_str: 0
head_conv: 256
heads: {'hm': 2, 'wh': 2, 'id': 128, 'reg': 2}
hide_data_time: False
hm_weight: 1
id_loss: ce
id_weight: 1
input_h: 1088
input_img: /users/duanyou/c5/all_pretrain/test.txt
input_mode: video
input_res: 1088
input_video: ../videos/test5.mp4
input_w: 608
is_debug: False
keep_res: False
load_model:
lr: 0.0001
lr_step: [20, 27]
master_batch_size: 32
mean: None
metric: loss
min_box_area: 200
mse_loss: False
nID_dict: defaultdict(<class 'int'>, {0: 1011, 1: 1046})
nms_thres: 0.4
norm_wh: False
not_cuda_benchmark: False
not_prefetch_test: False
not_reg_offset: False
num_classes: 2
num_epochs: 40
num_iters: -1
num_stacks: 1
num_workers: 8
off_weight: 1
output_format: video
output_h: 272
output_res: 272
output_root: ../results
output_w: 152
pad: 31
print_iter: 0
reg_loss: l1
reg_offset: True
reid_cls_ids: 0,1
reid_dim: 128
resume: False
root_dir: /media/wdx/4AB67FF0B67FDB41/MCMOT/src/lib/../..
save_all: False
save_dir: /media/wdx/4AB67FF0B67FDB41/MCMOT/src/lib/../../exp/mot/all_hrnet
seed: 317
std: None
task: mot
test: False
test_mot15: False
test_mot16: False
test_mot17: False
test_mot20: False
track_buffer: 300
trainval: False
val_intervals: 10
val_mot15: False
val_mot16: False
val_mot17: False
val_mot20: False
vis_thresh: 0.5
wh_weight: 0.1
```
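A quick consistency check on the resolutions in this dump: with `down_ratio: 4`, the heatmap grid is the (padded) input divided by 4, which matches `input_h/input_w: 1088/608` against `output_h/output_w: 272/152`. The `(x | pad) + 1` rounding below is an assumption inferred from `pad: 31` (rounding up to a multiple of 32); the exact MCMOT code may differ, but the arithmetic shows 320 is itself a valid input size:

```python
# Sketch: expected heatmap grid size for a given input edge length,
# assuming padding rounds the input up to a multiple of 32 (pad = 31).
def output_size(input_size, down_ratio=4, pad=31):
    """Return the down_ratio-scaled output size for an input edge length."""
    if input_size % (pad + 1) != 0:
        input_size = (input_size | pad) + 1  # round up to a multiple of 32
    return input_size // down_ratio

assert output_size(1088) == 272  # matches input_h -> output_h in the opt dump
assert output_size(608) == 152   # matches input_w -> output_w
assert output_size(320) == 80    # 320 is already a multiple of 32
```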

CaptainEven commented 3 years ago

May I see the details of your test output?

wdxpython commented 3 years ago

```python
import os
from collections import defaultdict

import cv2
import numpy as np
import torch
import torch.nn.functional as F

# Project helpers from MCMOT's src/lib (module paths may differ):
# create_model, load_model, letterbox, mot_decode,
# _tranpose_and_gather_feat, map2orig, plot_detects


def test_single(img_path, dev):
    """Run detection on a single image and visualize the results.

    :param img_path: path to the test image
    :param dev: torch device (e.g. 'cpu' or a CUDA device)
    """
    if not os.path.isfile(img_path):
        print('[Err]: invalid image path.')
        return

    # Load model and put to device
    heads = {'hm': 2, 'reg': 2, 'wh': 2, 'id': 128}
    net = create_model(arch='hrnet_18', heads=heads, head_conv=-1)
    # print(net)
    model_path = '/media/wdx/4AB67FF0B67FDB41/MCMOT/models/model_last.pth'
    net = load_model(model=net, model_path=model_path)
    net = net.to(dev)
    net.eval()

    # Read image
    img_0 = cv2.imread(img_path)  # BGR
    assert img_0 is not None, 'Failed to load ' + img_path

    # Padded resize
    h_in, w_in = 320, 320  # (608, 1088) (320, 640)
    img, _, _, _ = letterbox(img=img_0, height=h_in, width=w_in)

    # Normalize RGB: BGR -> RGB and H×W×C -> C×H×W
    img = img[:, :, ::-1].transpose(2, 0, 1)
    img = np.ascontiguousarray(img, dtype=np.float32)
    img /= 255.0

    # Convert to tensor and put to device
    blob = torch.from_numpy(img).unsqueeze(0).to(dev)

    with torch.no_grad():
        # Network output
        output = net.forward(blob)[-1]

        # Tracking output
        hm = output['hm'].sigmoid_()
        reg = output['reg']
        wh = output['wh']
        id_feature = output['id']
        id_feature = F.normalize(id_feature, dim=1)  # L2 normalize

        # Decode output
        dets, inds, cls_inds_mask = mot_decode(hm, wh, reg, 2, False, 128)

        # Get ReID feature vector by object class
        cls_id_feats = []  # top-K feature vectors of each object class
        for cls_id in range(2):  # cls_id starts from 0
            # get inds of each object class
            cls_inds = inds[:, cls_inds_mask[cls_id]]

            # gather feats for each object class
            cls_id_feature = _tranpose_and_gather_feat(id_feature, cls_inds)  # inds: 1×128
            cls_id_feature = cls_id_feature.squeeze(0)  # n × FeatDim
            if dev == 'cpu':
                cls_id_feature = cls_id_feature.numpy()
            else:
                cls_id_feature = cls_id_feature.cpu().numpy()
            cls_id_feats.append(cls_id_feature)

        # Convert back to original image coordinate system
        height_0, width_0 = img_0.shape[0], img_0.shape[1]  # H, W of original input image
        dets = map2orig(dets, h_in // 4, w_in // 4, height_0, width_0, 2)  # translate and scale

        # Parse detections of each class
        dets_dict = defaultdict(list)
        for cls_id in range(2):  # cls_id starts from 0
            cls_dets = dets[cls_id]
            # filter out low-confidence detections
            remain_inds = cls_dets[:, 4] > 0.4
            cls_dets = cls_dets[remain_inds]
            # cls_id_feature = cls_id_feats[cls_id][remain_inds]  # if ReID features are needed
            dets_dict[cls_id] = cls_dets

    # Visualize detection results
    img_draw = plot_detects(img_0, dets_dict, 2, frame_id=0, fps=30.0)
    cv2.imshow('Detection', img_draw)
    cv2.waitKey()
```
wdxpython commented 3 years ago

```
tensor([[[[0.04711, 0.03935, 0.04080,  ..., 0.04247, 0.04415, 0.06830],
          [0.03039, 0.02412, 0.02653,  ..., 0.02794, 0.02994, 0.05236],
          [0.03160, 0.02299, 0.02343,  ..., 0.02548, 0.02898, 0.05258],
          ...,
          [0.03337, 0.02457, 0.02546,  ..., 0.02669, 0.02662, 0.05000],
          [0.03635, 0.02653, 0.02875,  ..., 0.02519, 0.02711, 0.05026],
          [0.04489, 0.03916, 0.03674,  ..., 0.03430, 0.03305, 0.05366]],

         [[0.04060, 0.02599, 0.02604,  ..., 0.02744, 0.02663, 0.04787],
          [0.02749, 0.01598, 0.01731,  ..., 0.01705, 0.01761, 0.03215],
          [0.02628, 0.01487, 0.01625,  ..., 0.01553, 0.01803, 0.03336],
          ...,
          [0.02840, 0.01718, 0.01767,  ..., 0.01754, 0.01734, 0.03329],
          [0.02861, 0.01729, 0.01972,  ..., 0.01712, 0.01684, 0.03204],
          [0.04224, 0.02825, 0.02934,  ..., 0.02623, 0.02662, 0.03995]]]])
```

```
tensor([[0.06830, 0.05502, 0.05445, 0.05414, 0.05414, 0.05413, 0.05408, 0.05407, 0.05407, 0.05388, 0.05378, 0.05374, 0.05373, 0.05369, 0.05367, 0.05366, 0.05358, 0.05345, 0.05343, 0.05315, 0.05308, 0.05282, 0.05275, 0.05274, 0.05274, 0.05273, 0.05267, 0.05265, 0.05258, 0.05237, 0.05183, 0.05114, 0.04998, 0.04787, 0.04711, 0.04489, 0.04454, 0.04448, 0.04448, 0.04447, 0.04445, 0.04444, 0.04443, 0.04435, 0.04417, 0.04417, 0.04412, 0.04412, 0.04411, 0.04409, 0.04407, 0.04399, 0.04393, 0.04341, 0.04224, 0.04191, 0.04183, 0.04182, 0.04182, 0.04182, 0.04181, 0.04181, 0.04177, 0.04126, 0.04060, 0.03995, 0.03816, 0.03575, 0.03573, 0.03573, 0.03573, 0.03572, 0.03572, 0.03571, 0.03567, 0.03559, 0.03559, 0.03558, 0.03558, 0.03558, 0.03558, 0.03558, 0.03557, 0.03541, 0.03489, 0.03484, 0.03479, 0.03479, 0.03479, 0.03479, 0.03478, 0.03476, 0.03475, 0.03470, 0.03460, 0.03435, 0.03435, 0.03435, 0.03435, 0.03433, 0.03430, 0.03423, 0.03422, 0.03403, 0.03402, 0.03402, 0.03402, 0.03399, 0.03399, 0.03398, 0.03398, 0.03397, 0.03397, 0.03397, 0.03397, 0.03396, 0.03396, 0.03395, 0.03393, 0.03391, 0.03391, 0.03391, 0.03391, 0.03390, 0.03390, 0.03389, 0.03389, 0.03388]])

tensor([[[ 7.90638e+01, -3.67965e-01, 7.93357e+01, 6.18700e-01, 6.83022e-02, 0.00000e+00], [ 7.92027e+01, 7.35409e+01, 7.94364e+01, 7.49733e+01, 5.50156e-02, 0.00000e+00], [ 7.91299e+01, 7.14896e+01, 7.94680e+01, 7.30581e+01, 5.44519e-02, 0.00000e+00], [ 7.92019e+01, 2.55441e+01, 7.94375e+01, 2.69629e+01, 5.41382e-02, 0.00000e+00], [ 7.92019e+01, 3.35442e+01, 7.94375e+01, 3.49628e+01, 5.41380e-02, 0.00000e+00], [ 7.92019e+01, 4.15441e+01, 7.94376e+01, 4.29628e+01, 5.41298e-02, 0.00000e+00], [ 7.92014e+01, 4.95441e+01, 7.94379e+01, 5.09630e+01, 5.40782e-02, 0.00000e+00], [ 7.92008e+01, 5.75442e+01, 7.94382e+01, 5.89633e+01, 5.40713e-02, 0.00000e+00], [ 7.92030e+01, 1.75462e+01, 7.94359e+01, 1.89579e+01, 5.40658e-02, 0.00000e+00], [ 7.91967e+01, 9.54430e+00, 7.94377e+01, 1.09616e+01, 5.38782e-02, 0.00000e+00], [ 7.91300e+01,
2.34990e+01, 7.94694e+01, 2.50534e+01, 5.37788e-02, 0.00000e+00], [ 7.91290e+01, 3.14992e+01, 7.94692e+01, 3.30544e+01, 5.37390e-02, 0.00000e+00], [ 7.91290e+01, 3.94991e+01, 7.94692e+01, 4.10543e+01, 5.37323e-02, 0.00000e+00], [ 7.91288e+01, 4.74989e+01, 7.94695e+01, 4.90543e+01, 5.36942e-02, 0.00000e+00], [ 7.91284e+01, 5.54987e+01, 7.94697e+01, 5.70543e+01, 5.36654e-02, 0.00000e+00], [ 7.90626e+01, 7.87430e+01, 7.93553e+01, 7.98677e+01, 5.36612e-02, 0.00000e+00], [ 7.91281e+01, 1.55012e+01, 7.94688e+01, 1.70529e+01, 5.35826e-02, 0.00000e+00], [ 7.91667e+01, 7.56150e+01, 7.94397e+01, 7.70461e+01, 5.34456e-02, 0.00000e+00], [ 7.91227e+01, 7.49614e+00, 7.94588e+01, 9.02112e+00, 5.34290e-02, 0.00000e+00], [ 7.91953e+01, 6.55452e+01, 7.94432e+01, 6.69625e+01, 5.31525e-02, 0.00000e+00], [ 7.91272e+01, 6.34999e+01, 7.94734e+01, 6.50518e+01, 5.30823e-02, 0.00000e+00], [ 7.91665e+01, 1.96117e+01, 7.94412e+01, 2.10321e+01, 5.28198e-02, 0.00000e+00], [ 7.91652e+01, 3.60888e+00, 7.94385e+01, 5.00083e+00, 5.27500e-02, 0.00000e+00], [ 7.91649e+01, 2.76107e+01, 7.94419e+01, 2.90366e+01, 5.27428e-02, 0.00000e+00], [ 7.91651e+01, 3.56108e+01, 7.94419e+01, 3.70364e+01, 5.27389e-02, 0.00000e+00], [ 7.91649e+01, 4.36107e+01, 7.94420e+01, 4.50365e+01, 5.27289e-02, 0.00000e+00], [ 7.91643e+01, 5.16108e+01, 7.94423e+01, 5.30365e+01, 5.26677e-02, 0.00000e+00], [ 7.91636e+01, 5.96108e+01, 7.94425e+01, 6.10367e+01, 5.26540e-02, 0.00000e+00], [ 7.91876e+01, 1.54796e+00, 7.94294e+01, 2.98037e+00, 5.25819e-02, 0.00000e+00], [ 7.91604e+01, 1.16102e+01, 7.94437e+01, 1.30361e+01, 5.23711e-02, 0.00000e+00], [ 7.91575e+01, 6.76072e+01, 7.94479e+01, 6.90320e+01, 5.18331e-02, 0.00000e+00], [ 7.91533e+01, 5.53014e+00, 7.94211e+01, 6.95574e+00, 5.11384e-02, 0.00000e+00], [ 7.91528e+01, 6.95298e+01, 7.94262e+01, 7.10122e+01, 4.99848e-02, 0.00000e+00], [ 7.90638e+01, -3.67965e-01, 7.93357e+01, 6.18700e-01, 4.78697e-02, 1.00000e+00], [ 8.52555e-02, -3.26993e-01, 4.04025e-01, 4.96433e-01, 4.71150e-02,
0.00000e+00], [ 4.20971e-02, 7.86775e+01, 3.57235e-01, 7.96152e+01, 4.48916e-02, 0.00000e+00], [ 6.21862e+01, -5.72737e-01, 6.25353e+01, 6.57384e-01, 4.45394e-02, 0.00000e+00], [ 5.41863e+01, -5.72282e-01, 5.45356e+01, 6.56822e-01, 4.44830e-02, 0.00000e+00], [ 4.61864e+01, -5.72133e-01, 4.65356e+01, 6.56740e-01, 4.44817e-02, 0.00000e+00], [ 3.81864e+01, -5.72046e-01, 3.85356e+01, 6.56676e-01, 4.44732e-02, 0.00000e+00], [ 3.01865e+01, -5.71667e-01, 3.05356e+01, 6.56582e-01, 4.44494e-02, 0.00000e+00], [ 2.21865e+01, -5.70911e-01, 2.25353e+01, 6.56434e-01, 4.44363e-02, 0.00000e+00], [ 7.01865e+01, -5.68917e-01, 7.05340e+01, 6.56224e-01, 4.44281e-02, 0.00000e+00], [ 1.41885e+01, -5.64447e-01, 1.45344e+01, 6.55414e-01, 4.43489e-02, 0.00000e+00], [ 5.82091e+01, -5.78412e-01, 5.85277e+01, 6.02535e-01, 4.41737e-02, 0.00000e+00], [ 6.62097e+01, -5.77815e-01, 6.65299e+01, 6.04644e-01, 4.41671e-02, 0.00000e+00], [ 5.02092e+01, -5.77840e-01, 5.05280e+01, 6.02016e-01, 4.41238e-02, 0.00000e+00], [ 4.22093e+01, -5.77596e-01, 4.25279e+01, 6.01899e-01, 4.41195e-02, 0.00000e+00], [ 3.42093e+01, -5.77535e-01, 3.45280e+01, 6.01867e-01, 4.41115e-02, 0.00000e+00], [ 2.62095e+01, -5.77359e-01, 2.65283e+01, 6.01855e-01, 4.40873e-02, 0.00000e+00], [ 1.82096e+01, -5.76663e-01, 1.85281e+01, 6.01487e-01, 4.40722e-02, 0.00000e+00], [ 7.42043e+01, -5.91818e-01, 7.45277e+01, 6.01674e-01, 4.39945e-02, 0.00000e+00], [ 1.02121e+01, -5.73663e-01, 1.05310e+01, 6.02028e-01, 4.39304e-02, 0.00000e+00], [ 6.17614e+00, -5.69885e-01, 6.53404e+00, 6.51309e-01, 4.34110e-02, 0.00000e+00], [ 4.20971e-02, 7.86775e+01, 3.57235e-01, 7.96152e+01, 4.22368e-02, 1.00000e+00], [ 6.50947e+01, 7.85426e+01, 6.54136e+01, 7.99829e+01, 4.19064e-02, 0.00000e+00], [ 1.70938e+01, 7.85447e+01, 1.74106e+01, 7.99757e+01, 4.18275e-02, 0.00000e+00], [ 5.70936e+01, 7.85447e+01, 5.74111e+01, 7.99773e+01, 4.18184e-02, 0.00000e+00], [ 4.10936e+01, 7.85449e+01, 4.14107e+01, 7.99771e+01, 4.18183e-02, 0.00000e+00], [ 3.30936e+01,
7.85449e+01, 3.34107e+01, 7.99771e+01, 4.18163e-02, 0.00000e+00], [ 4.90936e+01, 7.85449e+01, 4.94109e+01, 7.99772e+01, 4.18113e-02, 0.00000e+00], [ 2.50936e+01, 7.85450e+01, 2.54108e+01, 7.99768e+01, 4.18077e-02, 0.00000e+00], [ 9.09325e+00, 7.85452e+01, 9.41197e+00, 7.99724e+01, 4.17722e-02, 0.00000e+00], [ 7.30929e+01, 7.85500e+01, 7.34123e+01, 7.99684e+01, 4.12588e-02, 0.00000e+00], [ 8.52555e-02, -3.26993e-01, 4.04025e-01, 4.96433e-01, 4.05957e-02, 1.00000e+00], [ 7.90626e+01, 7.87430e+01, 7.93553e+01, 7.98677e+01, 3.99533e-02, 1.00000e+00], [ 3.05507e+00, 7.85500e+01, 3.45746e+00, 7.99161e+01, 3.81633e-02, 0.00000e+00], [ 1.74281e-01, 6.95688e+01, 5.30405e-01, 7.07448e+01, 3.57480e-02, 0.00000e+00], [ 1.71374e-01, 2.95700e+01, 5.22038e-01, 3.07371e+01, 3.57284e-02, 0.00000e+00], [ 1.71257e-01, 4.55700e+01, 5.22041e-01, 4.67372e+01, 3.57268e-02, 0.00000e+00], [ 1.71270e-01, 3.75701e+01, 5.21941e-01, 3.87369e+01, 3.57255e-02, 0.00000e+00], [ 1.72484e-01, 2.15730e+01, 5.19493e-01, 2.27329e+01, 3.57244e-02, 0.00000e+00], [ 1.71485e-01, 6.15692e+01, 5.22919e-01, 6.27390e+01, 3.57235e-02, 0.00000e+00], [ 1.71243e-01, 5.35697e+01, 5.22549e-01, 5.47378e+01, 3.57149e-02, 0.00000e+00], [ 1.69592e-01, 1.35652e+01, 5.24776e-01, 1.47434e+01, 3.56688e-02, 0.00000e+00], [ 4.50631e+01, 7.85444e+01, 4.54263e+01, 8.00039e+01, 3.55876e-02, 0.00000e+00], [ 3.70631e+01, 7.85444e+01, 3.74263e+01, 8.00038e+01, 3.55873e-02, 0.00000e+00], [ 2.10628e+01, 7.85440e+01, 2.14263e+01, 8.00021e+01, 3.55844e-02, 0.00000e+00], [ 2.90629e+01, 7.85444e+01, 2.94263e+01, 8.00035e+01, 3.55832e-02, 0.00000e+00], [ 6.10629e+01, 7.85442e+01, 6.14266e+01, 8.00042e+01, 3.55818e-02, 0.00000e+00], [ 5.30630e+01, 7.85443e+01, 5.34264e+01, 8.00040e+01, 3.55775e-02, 0.00000e+00], [ 1.30602e+01, 7.85434e+01, 1.34277e+01, 8.00000e+01, 3.55755e-02, 0.00000e+00], [ 6.90617e+01, 7.85407e+01, 6.94297e+01, 8.00128e+01, 3.55699e-02, 0.00000e+00], [ 7.91667e+01, 7.56150e+01, 7.94397e+01, 7.70461e+01, 3.54149e-02,
1.00000e+00], [ 5.06624e+00, 7.85541e+01, 5.42849e+00, 7.99929e+01, 3.48870e-02, 0.00000e+00], [ 7.91665e+01, 1.96117e+01, 7.94412e+01, 2.10321e+01, 3.48442e-02, 1.00000e+00], [ 7.91636e+01, 5.96108e+01, 7.94425e+01, 6.10367e+01, 3.47905e-02, 1.00000e+00], [ 7.91651e+01, 3.56108e+01, 7.94419e+01, 3.70364e+01, 3.47898e-02, 1.00000e+00], [ 7.91649e+01, 2.76107e+01, 7.94419e+01, 2.90366e+01, 3.47893e-02, 1.00000e+00], [ 7.91649e+01, 4.36107e+01, 7.94420e+01, 4.50365e+01, 3.47866e-02, 1.00000e+00], [ 7.91643e+01, 5.16108e+01, 7.94423e+01, 5.30365e+01, 3.47757e-02, 1.00000e+00], [ 1.54093e-01, 5.59417e+00, 5.18654e-01, 6.70094e+00, 3.47613e-02, 0.00000e+00], [ 7.91604e+01, 1.16102e+01, 7.94437e+01, 1.30361e+01, 3.47474e-02, 1.00000e+00], [ 1.63515e-01, 7.45422e+01, 5.39396e-01, 7.57636e+01, 3.46969e-02, 0.00000e+00], [ 7.91575e+01, 6.76072e+01, 7.94479e+01, 6.90320e+01, 3.45976e-02, 1.00000e+00], [ 1.53481e-01, 5.85456e+01, 5.34285e-01, 5.97472e+01, 3.43511e-02, 0.00000e+00], [ 1.53099e-01, 4.25468e+01, 5.33312e-01, 4.37458e+01, 3.43478e-02, 0.00000e+00], [ 1.53063e-01, 3.45470e+01, 5.33196e-01, 3.57456e+01, 3.43468e-02, 0.00000e+00], [ 1.53158e-01, 2.65468e+01, 5.33297e-01, 2.77458e+01, 3.43465e-02, 0.00000e+00], [ 1.53127e-01, 5.05464e+01, 5.33909e-01, 5.17465e+01, 3.43307e-02, 0.00000e+00], [ 7.70533e+01, 7.85565e+01, 7.74367e+01, 7.99927e+01, 3.42958e-02, 0.00000e+00], [ 1.52891e-01, 1.85482e+01, 5.31067e-01, 1.97417e+01, 3.42289e-02, 0.00000e+00], [ 1.52129e-01, 1.05388e+01, 5.38728e-01, 1.17520e+01, 3.42159e-02, 0.00000e+00], [ 7.91579e+01, 2.62343e+00, 7.94015e+01, 3.97309e+00, 3.40344e-02, 1.00000e+00], [ 5.92509e+01, 3.22252e+01, 5.96041e+01, 3.40986e+01, 3.40217e-02, 0.00000e+00], [ 5.92510e+01, 2.42246e+01, 5.96043e+01, 2.60990e+01, 3.40175e-02, 0.00000e+00], [ 5.92508e+01, 4.02248e+01, 5.96043e+01, 4.20990e+01, 3.40162e-02, 0.00000e+00], [ 4.32504e+01, 4.02263e+01, 4.36028e+01, 4.20968e+01, 3.39866e-02, 0.00000e+00], [ 4.32505e+01, 3.22268e+01, 4.36025e+01,
3.40964e+01, 3.39861e-02, 0.00000e+00], [ 4.32508e+01, 2.42261e+01, 4.36028e+01, 2.60967e+01, 3.39790e-02, 0.00000e+00], [ 5.12505e+01, 3.22262e+01, 5.16032e+01, 3.40969e+01, 3.39754e-02, 0.00000e+00], [ 5.12504e+01, 4.02258e+01, 5.16035e+01, 4.20973e+01, 3.39734e-02, 0.00000e+00], [ 3.52504e+01, 4.02266e+01, 3.56028e+01, 4.20965e+01, 3.39694e-02, 0.00000e+00], [ 5.12507e+01, 2.42256e+01, 5.16035e+01, 2.60973e+01, 3.39689e-02, 0.00000e+00], [ 3.52504e+01, 3.22271e+01, 3.56025e+01, 3.40961e+01, 3.39679e-02, 0.00000e+00], [ 7.91299e+01, 7.14896e+01, 7.94680e+01, 7.30581e+01, 3.39623e-02, 1.00000e+00], [ 3.52507e+01, 2.42264e+01, 3.56028e+01, 2.60965e+01, 3.39607e-02, 0.00000e+00], [ 5.92502e+01, 4.82243e+01, 5.96056e+01, 5.01003e+01, 3.39454e-02, 0.00000e+00], [ 4.32499e+01, 4.82255e+01, 4.36043e+01, 5.00984e+01, 3.39269e-02, 0.00000e+00], [ 3.52499e+01, 4.82258e+01, 3.56043e+01, 5.00982e+01, 3.39134e-02, 0.00000e+00], [ 2.72503e+01, 4.02271e+01, 2.76031e+01, 4.20962e+01, 3.39106e-02, 0.00000e+00], [ 5.12499e+01, 4.82251e+01, 5.16049e+01, 5.00989e+01, 3.39090e-02, 0.00000e+00], [ 2.72503e+01, 3.22276e+01, 2.76029e+01, 3.40957e+01, 3.39083e-02, 0.00000e+00], [ 5.92498e+01, 5.62222e+01, 5.96068e+01, 5.81031e+01, 3.39031e-02, 0.00000e+00], [ 2.72506e+01, 2.42269e+01, 2.76032e+01, 2.60961e+01, 3.39019e-02, 0.00000e+00], [ 4.32496e+01, 5.62232e+01, 4.36058e+01, 5.81018e+01, 3.38881e-02, 0.00000e+00], [ 1.52538e-01, 6.65402e+01, 5.43384e-01, 6.77569e+01, 3.38862e-02, 0.00000e+00], [ 3.52496e+01, 5.62234e+01, 3.56058e+01, 5.81016e+01, 3.38780e-02, 0.00000e+00]]])
```

wdxpython commented 3 years ago

These are the hm (heatmap) and dets (detections) outputs printed above.

wdxpython commented 3 years ago

hm is the raw network output, with no further processing. The printed values are all too small, so everything gets filtered out at the confidence threshold, which suggests a training problem. But the input size is the same for training and testing. Your number of classes is 5, and I changed that in opt as well; what else needs to be modified?

wdxpython commented 3 years ago

I have two classes; in the label files one is 0 and the other is 1. Is that right? I saw you said background is 0 — should I use 1 and 2 instead?

CaptainEven commented 3 years ago

That comment is outdated. In the first version of MCMOT, background was class 0; in the current version, cls_id simply starts from 0. First make the training resolution the same as the test resolution, and then let me see the printed output of the test run.

wdxpython commented 3 years ago

Training and testing are both at resolution 320 now. What information from the test process do you want to see? I'll print it out.

wdxpython commented 3 years ago

Why is batch['hm'] all zeros when I print it?
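An all-zero `batch['hm']` usually means no ground-truth center ever lands on the output grid, e.g. because the centers are computed in input-pixel coordinates instead of `down_ratio`-scaled grid coordinates, or because all boxes get filtered out beforehand. A simplified sketch of CenterNet-style target drawing (not MCMOT's exact `draw_umich_gaussian`, and the 80×80 grid / radius values are illustrative) shows both the correct case and the off-grid failure case:

```python
import numpy as np

def draw_gaussian(heatmap, center, radius):
    """Splat a 2D Gaussian peak at integer `center` (x, y) on the heatmap, in place."""
    diameter = 2 * radius + 1
    sigma = diameter / 6.0
    y, x = np.ogrid[-radius:radius + 1, -radius:radius + 1]
    g = np.exp(-(x * x + y * y) / (2 * sigma * sigma))  # peak value is exactly 1
    cx, cy = center
    h, w = heatmap.shape
    if not (0 <= cx < w and 0 <= cy < h):
        return heatmap  # center off the grid -> nothing drawn, heatmap stays zero
    x1, x2 = max(0, cx - radius), min(w, cx + radius + 1)
    y1, y2 = max(0, cy - radius), min(h, cy + radius + 1)
    heatmap[y1:y2, x1:x2] = np.maximum(
        heatmap[y1:y2, x1:x2],
        g[radius - (cy - y1):radius + (y2 - cy), radius - (cx - x1):radius + (x2 - x1)])
    return heatmap

# Correct case: center given in *output grid* coordinates (320 input / down_ratio 4 = 80).
hm = np.zeros((80, 80), dtype=np.float32)
draw_gaussian(hm, center=(int(0.52 * 80), int(0.49 * 80)), radius=2)
assert hm.max() == 1.0  # a valid target has a peak of 1 at each object center

# Failure case: input-pixel coordinates (0.52 * 320 = 166) fall outside the 80x80 grid.
hm_bad = np.zeros((80, 80), dtype=np.float32)
draw_gaussian(hm_bad, center=(166, 157), radius=2)
assert hm_bad.max() == 0.0  # nothing drawn -> batch['hm'] looks all-zero
```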

CaptainEven commented 3 years ago

The printed output from running test_single.

wdxpython commented 3 years ago

With a 1088×608 input everything is fine, but after I changed everything to 320 the test detects nothing. Why is that? Is there anything I need to watch out for?

CaptainEven commented 3 years ago

Currently only single-resolution training is supported; there is no multi-resolution training strategy like YOLO's. If the test resolution is smaller than the training resolution, accuracy will drop, but usually not to the point of detecting nothing at all.

wdxpython commented 3 years ago

But I trained at 320 as well.

wdxpython commented 3 years ago

Maybe I wasn't clear: training at 1088×608 and testing at 1088×608 works fine. Training at 320×320 and testing at 320×320 detects nothing. I'm sure I changed every place related to the input size; are there any other parameters that need changing?

CaptainEven commented 3 years ago

For object detection, if the resolution is too low, some datasets may produce no detections at all.

CaptainEven commented 3 years ago

Of course, it could also be that the two resolutions are inconsistent somewhere in your code.

chengdianJiang commented 3 years ago

I ran into the same problem. I resized the images to 416×416 for both training and tracking, and the detection results were very poor — almost nothing matched. I trained for 48 epochs; the loss converged well at first, but after 40 epochs it dropped below 0. Could the author suggest some ways to debug this?
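A total loss below zero is not by itself a bug here: FairMOT-style training, which MCMOT builds on, balances the detection and ReID losses with learned uncertainty weights (Kendall et al.), `loss = 0.5 * (exp(-s_det)*L_det + exp(-s_id)*L_id + s_det + s_id)`, and the `s_det + s_id` term can drive the total negative once both task losses become small. A hedged sketch of that balancing (the loss values below are made up for illustration):

```python
import math

def uncertainty_loss(det_loss, id_loss, s_det, s_id):
    """FairMOT-style automatic loss balancing; s_det and s_id are learned scalars."""
    return 0.5 * (math.exp(-s_det) * det_loss + math.exp(-s_id) * id_loss + s_det + s_id)

# Once both task losses are small, the learned s_* drift negative and
# the s_det + s_id term can pull the total below zero.
total = uncertainty_loss(det_loss=0.1, id_loss=0.1, s_det=-2.0, s_id=-2.0)
print(total)  # negative, even though both task losses are positive
```

So a negative total is expected late in training; the poor detections are more likely a resolution or preprocessing mismatch, as discussed above.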