eg4000 / SKU110K_CVPR19


ValueError: invalid CSV annotations file annotations_train.csv: line 1494: y2 (799) must be higher than y1 (799) #90

Open Akash481 opened 3 years ago

Akash481 commented 3 years ago

Loading images... 100% 977/977 [00:01<00:00, 560.73it/s]
Traceback (most recent call last):
  File "train.py", line 450, in <module>
    main()
  File "train.py", line 394, in main
    train_generator, validation_generator = create_generators(args)
  File "train.py", line 219, in create_generators
    image_max_side=args.image_max_side
  File "/content/drive/My Drive/sku110k/SKU110K_CVPR19/object_detector_retinanet/keras_retinanet/preprocessing/csv_generator.py", line 226, in __init__
    raise_from(ValueError('invalid CSV annotations file: {}: {}'.format(csv_data_file, e)), None)
  File "<string>", line 3, in raise_from
ValueError: invalid CSV annotations file: annotations_train.csv: line 1494: y2 (799) must be higher than y1 (799)

When I try to train on my custom data, it throws the error above. The problem is that the row at the reported line number in the annotations file does not have this issue (its y1 and y2 values differ), and the value 799 does not appear at that line, nor one line above or below it. The script is reading the correct annotation file, because when I printed the filenames they came from my dataset. Please help if anyone has faced this issue.
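For anyone hitting the same error, a quick way to cross-check the annotations independently of the generator is to scan the CSV for degenerate boxes. This is a minimal sketch, not part of the repository, and it assumes the standard SKU110K column layout (image_name, x1, y1, x2, y2, class, image_width, image_height) with no header row:

```python
import csv

# Scan the annotations CSV and report rows whose box has zero or negative
# width/height -- the condition that triggers the generator's ValueError.
with open("annotations_train.csv", newline="") as f:
    for line_no, row in enumerate(csv.reader(f), start=1):
        image_name = row[0]
        x1, y1, x2, y2 = map(float, row[1:5])
        if x2 <= x1 or y2 <= y1:
            print(f"line {line_no}: {image_name} box=({x1}, {y1}, {x2}, {y2})")
```

If the line number printed here disagrees with the one in the traceback, searching by image name rather than by line number is more reliable.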

datasith commented 2 years ago

Adding to this, there is a large number of annotations whose coordinates exceed the image size. I noticed this when normalizing the coordinates for use with YOLOv5: I was getting errors for coordinates greater than 1.0, which means original_x_coordinate / image_width or original_y_coordinate / image_height had a numerator larger than its denominator. Since the differences were small, I just clipped the original coordinates to the image bounds.
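For reference, this is roughly what the clipping looks like before converting to YOLO's normalized (x_center, y_center, width, height) format; a minimal sketch, assuming x1, y1, x2, y2 and the image width/height are already parsed from each annotation row:

```python
def to_yolo(x1, y1, x2, y2, img_w, img_h):
    """Clip a corner-format box to the image bounds and convert it to
    normalized YOLO (x_center, y_center, width, height)."""
    # Clip small overshoots so every normalized value stays within [0, 1].
    x1, x2 = max(0.0, min(x1, img_w)), max(0.0, min(x2, img_w))
    y1, y2 = max(0.0, min(y1, img_h)), max(0.0, min(y2, img_h))
    x_center = (x1 + x2) / 2.0 / img_w
    y_center = (y1 + y2) / 2.0 / img_h
    width = (x2 - x1) / img_w
    height = (y2 - y1) / img_h
    return x_center, y_center, width, height
```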

@Akash481, I doubt the authors will reply or address these issues, but if you could change the title of this issue to something more general (e.g., "List of invalid CSV annotations"), we could keep track of all the problems we find in the annotations in a single place. And for folks searching for the specific ValueError you reported, this issue will still come up, since you included it verbatim in your comment.