jimmyyhwu / pose-interpreter-networks

Real-time robotic object pose estimation with deep learning

Segmentation training issues: StopIteration #7

Closed: yikakabu closed this issue 5 years ago

yikakabu commented 5 years ago

Hi! Thank you for the great work. I created a JSON file of my own annotated data by following the COCO mask.py. Part of the JSON file is shown below:

{..., "image": [{"license": 0, "file_name": "0000007149_rgb.png", "coco_url": "", "height": 480, "width": 640, "date_captured": 1544011401.0, "camera_id": 0, "flickr_url": "", "id": 0},...], "annotations": [{"segmentation": {"size": [480, 640], "counts": "QgY37h>f0ZOe0\Od0\Od0\Od0[Of0[Oc0]O`0@3M2N2N3M2N2N2N3L3N2N2N3M2N2N2N2N1O001O0O2O0000001O000000O1000O10000000O10000O100000O0100000000O100000000O1000O10O1000000O100000000O0100000O100000000O1000O10O1000000O1000000O1000O1000O1000000O1000000O10O100000O1000000O10000000O0100000000O1000000O1000O10O100000000O100000O0100000000O10000O100000001N1000001N10001O0O101O000O2O0000001N101hMfFmLn0a0m;]OUDa0U=01N2O1O1O1N101O1O1N2O1O1N2N2MQeW3"}, "area": 34310, "pose": {"position": {"x": -0.0007623627074502259, "y": 0.34552469760243687, "z": 1.0687215939143497}, "orientation": {"x": 0.7472581634432386, "y": -0.14419430357527752, "z": 0.10985900520900949, "w": 0.6393310871202543}}, "iscrowd": 0, "image_id": 0, "bbox": [225.0, 242.0, 193.0, 219.0], "category_id": 1, "id": 0},...], "categories": [{"supercategory": "objects", "mesh": "charge_pile.stl", "id": 1, "name": "charge_pile"}]}

Then I modified the config file (drn_d_22_ChargePile.yml) and utils.py in the "segmentation" folder to match my object classes. But when I run

python train.py config/drn_d_22_ChargePile.yml

a "StopIteration" error is raised at

    first_input_batch, first_target_batch = iter(val_loader).next()

in train.py:217, along with "RuntimeWarning: invalid value encountered in true_divide" at

    return np.diag(hist) / (hist.sum(1) + hist.sum(0) - np.diag(hist))

in utils.py:25. It seems my dataset wasn't loaded correctly. Can you give me any suggestions on how to solve these problems? Thanks.

jimmyyhwu commented 5 years ago

Maybe you could use matplotlib to visualize the images that are being loaded by val_loader?

The StopIteration error probably means val_loader isn't loading any images. I would check that self.img_ids in datasets.py is correct.
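
For example, something along these lines (a rough sketch; it assumes val_loader yields (input, target) tensor batches as in train.py, with CHW float image tensors):

    # Rough debugging sketch: check the dataset size and plot the first batch.
    import matplotlib.pyplot as plt

    # If this prints 0, the dataset is empty and iter(val_loader) will
    # immediately raise StopIteration.
    print('num val images:', len(val_loader.dataset))

    input_batch, target_batch = next(iter(val_loader))  # Python 3 spelling of iter(...).next()
    plt.subplot(1, 2, 1)
    plt.imshow(input_batch[0].permute(1, 2, 0).numpy())  # CHW -> HWC for plotting
    plt.subplot(1, 2, 2)
    plt.imshow(target_batch[0].numpy())  # segmentation target
    plt.show()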

yikakabu commented 5 years ago

@jimmyyhwu Thanks for your reply. You are right, the annotations in my JSON file had the wrong format. I have solved the issue and can start training now.
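
For anyone hitting the same problem, a minimal sketch like the following can decode one RLE segmentation and catch format errors early ('annotations.json' is a placeholder for your own annotation file, and pycocotools must be installed):

    # Minimal format check: decode one COCO-style RLE segmentation.
    import json
    from pycocotools import mask as mask_utils

    with open('annotations.json') as f:  # placeholder path
        data = json.load(f)

    ann = data['annotations'][0]
    rle = dict(ann['segmentation'])
    rle['counts'] = rle['counts'].encode('ascii')  # pycocotools expects bytes
    binary_mask = mask_utils.decode(rle)           # (H, W) uint8 mask
    print(binary_mask.shape, 'mask pixels:', int(binary_mask.sum()), 'vs area:', ann['area'])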

yikakabu commented 5 years ago

@jimmyyhwu Could you please explain the meaning of the "object_scales" parameter in generate_pose_lists.py? And how can I determine it for my own CAD model? Thanks very much.

    object_scales = {
        1: 0.3,
        2: 0.3,
        3: 0.2,
        4: 0.2,
        5: 0.7,
        6: 0.2,
        7: 0.2,
        8: 0.3,
        9: 0.3,
        10: 0.2,
    }

jimmyyhwu commented 5 years ago

The object_scale parameters are used to determine the position mean/cov when generating random positions here. Larger objects are usually further away from the camera. For example, the engine (scale=0.7) is much larger than the oil filter (scale=0.2). See object IDs here.

You will probably need to visually verify your rendered images and make sure the objects aren't too close or too far away.
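
To illustrate the idea (a simplified sketch, not the exact code in generate_pose_lists.py; the means and covariances here are made up): a larger object_scale pushes the sampled position distribution farther from the camera and widens its spread.

    # Simplified illustration with hypothetical numbers: sample a camera-frame
    # position whose mean distance and spread grow with object_scale.
    import numpy as np

    def sample_position(object_scale, rng=np.random):
        mean = np.array([0.0, 0.0, 2.0 * object_scale])     # z = distance from camera
        cov = np.diag([0.1, 0.1, 0.5]) * object_scale ** 2  # spread grows with scale
        return rng.multivariate_normal(mean, cov)

    print(sample_position(0.2))  # small object (e.g. oil filter): close to the camera
    print(sample_position(0.7))  # large object (e.g. engine): farther away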

yikakabu commented 5 years ago

@jimmyyhwu Thanks for your reply. Does that mean an appropriate distance from the object to the camera? So I should determine the scale of my model according to the size of the object. Do I understand that correctly?

jimmyyhwu commented 5 years ago

Yes, it should roughly be proportional to the size of the object. You can visually verify that the objects have appropriate sizes in the rendered images.

By the way, for further follow-up on this, it would be good to open a new, separate issue, since the object_scale is not relevant to the "StopIteration" error originally referenced in this issue.

yikakabu commented 5 years ago

@jimmyyhwu Understood. Thank you.