zyayoung / Iter-Deformable-DETR

[CVPR2022] "Progressive End-to-End Object Detection in Crowded Scenes" on Deformable-DETR.

assert self._width is not None and self._height is not None #1

Closed: ma3252788 closed this issue 2 years ago

ma3252788 commented 2 years ago

Hello, thank you very much for your work. I trained for 2 epochs and then started the evaluation, but it reported an error:

loading annotations into memory...
Done (t=0.32s)
creating index...
index created!
saved
Traceback (most recent call last):
  File "demo_opt.py", line 472, in <module>
    eval_results = _evaluate_predictions_on_crowdhuman(gt_path, fpath)
  File "demo_opt.py", line 448, in _evaluate_predictions_on_crowdhuman
    database = Database(gt_path, dt_path, target_key, None, mode)
  File "demo_opt.py", line 312, in __init__
    self.loadData(dtpath, body_key, head_key, if_gt=False)
  File "demo_opt.py", line 331, in loadData
    self.images[record["ID"]].clip_all_boader()
  File "demo_opt.py", line 195, in clip_all_boader
    assert self._width is not None and self._height is not None
AssertionError

I followed the instructions exactly, so why am I getting this error?

zyayoung commented 2 years ago

We embed each image's width and height in the annotation file. You can use the following script to add them.

import json
import os

from tqdm import tqdm
from PIL import Image

# Source annotation file (one JSON record per line) and the destination
# file that will additionally carry each image's width and height.
src = '/data/datasets/crowdhuman/annotation_val.odgt'
dst = '/data/datasets/crowdhuman/annotation_val_hw.odgt'
# Directory containing the images referenced by the annotation file above.
img_path = '/data/datasets/crowdhuman/train_image/'

with open(src, 'r') as f:
    lines = f.readlines()

with open(dst, 'w') as f:
    for line in tqdm(lines):
        record = json.loads(line)
        file_path = os.path.join(img_path, '{}.jpg'.format(record['ID']))
        # Image.open only reads the header here, so the size lookup is cheap.
        im = Image.open(file_path)
        record['width'], record['height'] = im.size
        f.write(json.dumps(record) + "\n")
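
In case it helps, here is a minimal sanity check (a sketch that reuses the dst path from the script above, purely for illustration) to confirm that every record in the new .odgt file carries width and height; presumably gt_path in the evaluation should then point to this new file.

import json

dst = '/data/datasets/crowdhuman/annotation_val_hw.odgt'

# Each line of the .odgt file is one JSON record; after running the script
# above, every record should contain 'width' and 'height' keys.
with open(dst, 'r') as f:
    for line in f:
        record = json.loads(line)
        assert 'width' in record and 'height' in record, record['ID']
print('all records contain width and height')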

Sorry for the inconvenience.