Open EricPedley opened 1 year ago
Currently I can't find where the instance segmentation labels are. The segmentations themselves are in `pointcloud_instance_xxxx.npy` files, but their values seem arbitrary (e.g., the background is 108).

There are labels in `sementic_segmentation_labels_xxx.json`, corresponding to the segmentation images, but they're all letters, not shapes. They're also semantic segmentations, not instance segmentations.

Here's a photo of the data and the resulting mask from `pointcloud_instance_0000.npy` and `rgb_0000.png`, assuming that the background is 108:
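For reference, a minimal sketch of how the mask above can be pulled out of the `.npy` file. The background value 108 comes from inspecting `pointcloud_instance_0000.npy` by hand, so it is an assumption, not documented behavior:

```python
import numpy as np

def instance_ids(path, background=108):
    """Return the sorted non-background values in an instance .npy mask.

    `background=108` is an assumption from eyeballing the data, not a
    documented constant of the dataset.
    """
    mask = np.load(path)
    return sorted(v for v in np.unique(mask) if v != background)

def binary_mask(path, instance_id):
    """Boolean mask selecting the pixels of one instance value."""
    return np.load(path) == instance_id
```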
@Vince-C156

> Currently I can't find where the instance segmentation labels are. The segmentations themselves are in `pointcloud_instance_xxxx.npy` files, but their values seem arbitrary (e.g., the background is 108).
I don't think there are any specific instance segmentation labels; you'll have to create them yourself by combining the bounding box labels and the semantic segmentation labels. Maybe we can get them straight from Isaac, but I don't know whether that'd be more or less work than just writing our own script to combine them.
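The combining step described above could be sketched roughly like this. The box format `(cls, x1, y1, x2, y2)` in pixel coordinates and the semantic mask being a 2D array of class IDs are assumptions for illustration, not the dataset's documented format:

```python
import numpy as np

def instance_masks(semantic, boxes):
    """Yield (cls, mask) pairs by intersecting each bounding box with the
    semantic mask: the pixels of that class inside the box become one instance.

    semantic: 2D int array of per-pixel class IDs (assumed layout).
    boxes: iterable of (cls, x1, y1, x2, y2) in pixel coords (assumed format).
    """
    for cls, x1, y1, x2, y2 in boxes:
        mask = np.zeros(semantic.shape, dtype=bool)
        mask[y1:y2, x1:x2] = semantic[y1:y2, x1:x2] == cls
        yield cls, mask
```

This breaks down when two instances of the same class have overlapping boxes, which is one reason getting true instance labels straight from the simulator would be cleaner.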
Got it. I'll do that then
The data can be generated directly from Isaac Sim. On hold until #50 is done. See #49 for more information.
Have gotten everything to work now. Preliminary results show very high precision but low recall. We have about 7600 * 6 = 45,600 tiles for 37 classes, which works out to roughly 1,230 images per class, shy of the 1,500 images/class recommendation, so that might be part of the problem.
Actually, looks pretty good. This was 200 epochs. Will upload weights.
Each label line has the format

```
cls x1 y1 x2 y2 ... xn yn
```

where `x1 y1 ... xn yn` are the coordinates of a polygon enclosing the shape. To get this info you'll need to use the semantic segmentation information included in the new dataset. There are scripts in here: https://github.com/uci-uav-forge/godot-data-gen/tree/simple-labels that can help you out with converting a segmentation mask to a polygon.
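Once you have polygon vertices (e.g. from `cv2.findContours` on a binary mask, or from the scripts linked above), writing the label line is straightforward. Normalizing the coordinates to [0, 1], as YOLO-style segmentation labels expect, is an assumption here; drop the division if the format wants raw pixel coordinates:

```python
def polygon_label(cls, points, width, height):
    """Format one 'cls x1 y1 ... xn yn' label line.

    points: sequence of (x, y) pixel-coordinate vertices of the polygon.
    width/height: image dimensions used to normalize coords to [0, 1]
    (normalization is an assumption about the expected format).
    """
    coords = " ".join(f"{x / width:.6f} {y / height:.6f}" for x, y in points)
    return f"{cls} {coords}"
```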