stefanopini / simple-HRNet

Multi-person Human Pose Estimation with HRNet in Pytorch
GNU General Public License v3.0

Evaluate custom dataset #104

Closed dorakementzey closed 1 year ago

dorakementzey commented 1 year ago

Hi, I would like to evaluate the simple-HRNet model (which I am currently running on Colab for GPU access). I have a COCO-style JSON file of ground-truth values and the corresponding dataset. I tried running testing/Test.py with my image directory, but it did not work. Is there a way to either run an evaluation on my dataset with coco_ground_truth.json, or export a JSON file of the inference results on the dataset? Any help would be appreciated, thanks! What I ran before:

!python testing/Test.py --cfg  \
                     DATASET.TEST_SET testing/test_img_set \
                     TEST.MODEL_FILE ./weights/pose_hrnet_w32_256x192.pth \
                     TEST.SCALE_LIST 0.5,0.75,1.0,1.25,1.5,1.75 \
                     TEST.FLIP_TEST False

The output I got:

Traceback (most recent call last):
  File "/content/simple-HRNet/testing/Test.py", line 8, in <module>
    from datasets.HumanPoseEstimation import HumanPoseEstimationDataset
ModuleNotFoundError: No module named 'datasets'
stefanopini commented 1 year ago

Hi @dorakementzey, I'm sorry, but I haven't implemented a standalone testing function for the COCO dataset.

However, your dataset should be compatible, and you should be able to run an evaluation by commenting out this line https://github.com/stefanopini/simple-HRNet/blob/6bfcdaf5bcb006b945af9735883b198a54f62d4c/training/Train.py#L354 and then running

python scripts/train_coco.py [your params]

Alternatively, you can run the extract-keypoints script to save the outputs in CSV or JSON format. In that case, you would probably need to post-process the outputs to evaluate them with the COCO API.
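
For the post-processing step, a minimal sketch with pycocotools could look like the following, assuming the extracted keypoints have already been converted into the standard COCO results format (a list of dicts with image_id, category_id, keypoints and score); the file names are placeholders:

# Minimal sketch: evaluate keypoint predictions with the COCO API.
# Assumes the predictions were converted to the COCO "results" format beforehand.
from pycocotools.coco import COCO
from pycocotools.cocoeval import COCOeval

coco_gt = COCO("coco_ground_truth.json")         # ground-truth annotations
coco_dt = coco_gt.loadRes("hrnet_results.json")  # predictions in COCO results format

coco_eval = COCOeval(coco_gt, coco_dt, iouType="keypoints")
coco_eval.evaluate()
coco_eval.accumulate()
coco_eval.summarize()  # prints the keypoint AP/AR metrics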

dorakementzey commented 1 year ago

Thanks @stefanopini for your reply! When I download the Train.py file from Colab in order to change that line, the file is empty. I tried re-running several times, but this always happens (I use the officially linked Colab notebook). Edit: I tried opening the file directly in Colab, and that way it is not empty. I ran:

!python ./scripts/train_coco.py --exp_name result --disable_flip_test_images --coco_root_path ./datasets/my_dataset 

Additionally, I commented out the Train.py line and changed the data version in train_coco.py as follows, to point to my evaluation set:

ds_val = COCODataset(
        root_path=coco_root_path, data_version="test_img_set", is_train=False, use_gt_bboxes=(coco_bbox_path is None),
        bbox_path=coco_bbox_path, image_width=image_resolution[1], image_height=image_resolution[0], color_rgb=True,
    )

This gave the following error:

Traceback (most recent call last):
  File "/content/simple-HRNet/./scripts/train_coco.py", line 12, in <module>
    from datasets.COCO import COCODataset
  File "/content/simple-HRNet/datasets/COCO.py", line 9, in <module>
    import json_tricks as json
ModuleNotFoundError: No module named 'json_tricks'

Regarding the second suggestion, I also tried running the extract-keypoints script; I don't mind post-processing with the COCO API:

!python ./scripts/extract-keypoints.py --format JSON --filename my_image.png --json_output_filename hrnet_res.json  --image_resolution '(1244,937)'

From this, I got the following error message:

Traceback (most recent call last):
  File "/content/simple-HRNet/./scripts/extract-keypoints.py", line 183, in <module>
    main(**args.__dict__)
  File "/content/simple-HRNet/./scripts/extract-keypoints.py", line 37, in main
    assert format in ('csv', 'json')
AssertionError
stefanopini commented 1 year ago

On the first issue, could you try additionally running !pip install json-tricks at the beginning of the notebook?

On the second issue, the code is a bit pedantic and requires the format in lowercase: --format json 😄
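
Concretely, the two fixes together would look something like this, at the top of the notebook and for the extraction call (same paths and resolution as in your commands above):

!pip install json-tricks

!python ./scripts/extract-keypoints.py --format json --filename my_image.png \
    --json_output_filename hrnet_res.json --image_resolution '(1244,937)'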

dorakementzey commented 1 year ago

@stefanopini Thanks for your prompt reply! That was a bit silly of me. I have run both of your suggestions; however, I still run into issues. After installing json-tricks, I get the following error:

Traceback (most recent call last):
  File "/content/simple-HRNet/./scripts/train_coco.py", line 12, in <module>
    from datasets.COCO import COCODataset
  File "/content/simple-HRNet/datasets/COCO.py", line 16, in <module>
    from misc.nms.nms import oks_nms
  File "/content/simple-HRNet/misc/nms/nms.py", line 13, in <module>
    from cpu_nms import cpu_nms
ModuleNotFoundError: No module named 'cpu_nms'

I tried installing cpu_nms and got the following:

ERROR: Could not find a version that satisfies the requirement cpu_nms (from versions: none)
ERROR: No matching distribution found for cpu_nms
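
For context, cpu_nms is not a package on PyPI; it is a Cython extension from the original HRNet code that has to be compiled locally. Assuming simple-HRNet vendors those sources next to misc/nms/nms.py together with the usual setup_linux.py build script (an assumption about the repo layout), building the extension in place should make the import work:

# Assumption: misc/nms contains the Cython nms sources and a setup_linux.py,
# as in the original HRNet repository. Build the extension in place:
!cd /content/simple-HRNet/misc/nms && python setup_linux.py build_ext --inplace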

Running the extract-keypoints script, I got the following:

device: 'cuda' - 1 GPU(s) will be used
Traceback (most recent call last):
  File "/content/simple-HRNet/./scripts/extract-keypoints.py", line 183, in <module>
    main(**args.__dict__)
  File "/content/simple-HRNet/./scripts/extract-keypoints.py", line 99, in main
    pts = model.predict(frame)
  File "/content/simple-HRNet/SimpleHRNet.py", line 206, in predict
    return self._predict_single(image)
  File "/content/simple-HRNet/SimpleHRNet.py", line 286, in _predict_single
    out = self.model(images)
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/parallel/data_parallel.py", line 169, in forward
    return self.module(*inputs[0], **kwargs[0])
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/content/simple-HRNet/models_/hrnet.py", line 168, in forward
    x = self.stage2(x)
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/container.py", line 217, in forward
    input = module(input)
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/content/simple-HRNet/models_/hrnet.py", line 66, in forward
    x_fused[i] = x_fused[i] + self.fuse_layers[i][j](x[j])
RuntimeError: The size of tensor a (235) must match the size of tensor b (236) at non-singleton dimension 3

I would prefer getting the keypoint extraction to work, as I think it is less prone to errors with the various formats, etc. My dataset consists of image.png files with one accompanying images.json, but I'm not entirely sure this would work, looking at the datasets/COCO.py file. I used JSON instead of json initially because the notes in the code mention CSV or JSON, which misled me slightly.
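
As for the RuntimeError in the extract-keypoints run, the size mismatch (235 vs. 236) most likely comes from the requested --image_resolution '(1244,937)': HRNet downsamples the input by factors of 4, 8, 16 and 32 and then fuses the branches, so when the height and width are not multiples of 32, the feature maps of different branches can end up one pixel apart. A small sketch of rounding an arbitrary resolution to safe values (the helper name is made up for illustration):

# Sketch: round (height, width) to multiples of 32 so that HRNet's
# multi-resolution branches stay spatially aligned when they are fused.
def round_to_multiple(value, base=32):
    return max(base, int(round(value / base)) * base)

height, width = 1244, 937
safe_resolution = (round_to_multiple(height), round_to_multiple(width))
print(safe_resolution)  # (1248, 928) -> pass as --image_resolution '(1248, 928)'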

EDIT: I ended up manually extracting the keypoints by printing the output. Thanks for your help! I will close this issue now.
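
For anyone landing here later, the manual route can be scripted directly with the SimpleHRNet class. A minimal sketch, assuming the person detector used by the default multi-person mode is already set up as in the Colab notebook, and using an ad-hoc JSON layout rather than the official COCO results format (paths are placeholders):

# Sketch: run SimpleHRNet on every image in a folder and dump the raw
# predicted keypoints to a JSON file for later post-processing.
import glob
import json
import cv2
from SimpleHRNet import SimpleHRNet

# 32 channels / 17 joints match the pose_hrnet_w32_256x192.pth checkpoint
model = SimpleHRNet(32, 17, "./weights/pose_hrnet_w32_256x192.pth")

results = {}
for path in sorted(glob.glob("testing/test_img_set/*.png")):
    image = cv2.imread(path, cv2.IMREAD_COLOR)
    # each joint is a (coordinate, coordinate, confidence) triple; check the
    # SimpleHRNet docstring for the exact coordinate ordering
    joints = model.predict(image)  # shape: (num_people, num_joints, 3)
    results[path] = joints.tolist()

with open("hrnet_res.json", "w") as f:
    json.dump(results, f)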