akshitac8 / OW-DETR

[CVPR 2022] Official Pytorch code for OW-DETR: Open-world Detection Transformer

Unable to run the project at all! #63

Open · Honey-ashead opened this issue 7 months ago

Honey-ashead commented 7 months ago

First, the structure of the datasets used by the project is not shown clearly, which causes confusion when setting up and running it. Second, in datasets/torchvision_datasets/open_world.py, the OWDetection class has a method called remove_prev_class_and_unk_instances with a latent problem: an image may contain only instances that do not belong to any valid class for the current task, in which case all of its instances are removed and the method returns no annotations. This triggers a series of errors downstream, since the bounding-box tensor of such an image has shape torch.Size([0]). I wonder why the authors did not consider this case.
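To make the failure mode concrete, here is a minimal sketch (my own illustration, not code from the repo) of why an empty annotation list breaks the resize transform, together with a defensive reshape that would at least let the scaling broadcast:

```python
import torch

# When every instance is filtered out, the box list is empty and
# torch.as_tensor([]) yields a 1-D tensor of shape [0], not [0, 4]:
boxes = torch.as_tensor([], dtype=torch.float32)   # torch.Size([0])
ratios = torch.as_tensor([0.5, 0.5, 0.5, 0.5])
# boxes * ratios  # RuntimeError: The size of tensor a (0) must match
#                 # the size of tensor b (4) at non-singleton dimension 0

# A defensive workaround (an assumption, not the authors' fix): keep empty
# box tensors in (0, 4) shape so the per-coordinate scaling broadcasts.
boxes = boxes.reshape(-1, 4)                       # torch.Size([0, 4])
scaled = boxes * ratios                            # torch.Size([0, 4])
```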

lzylzylzy123456 commented 6 months ago

I have encountered this problem too. Have you seen the following error before?

```
Traceback (most recent call last):
  File "main_open_world.py", line 366, in <module>
    main(args)
  File "main_open_world.py", line 290, in main
    model, criterion, data_loader_train, optimizer, device, epoch, args.nc_epoch, args.clip_max_norm)
  File "/home/dgp/code/OW-DETR/engine.py", line 41, in train_one_epoch
    samples, targets = prefetcher.next()
  File "/home/dgp/code/OW-DETR/datasets/data_prefetcher.py", line 65, in next
    self.preload()
  File "/home/dgp/code/OW-DETR/datasets/data_prefetcher.py", line 25, in preload
    self.next_samples, self.next_targets = next(self.loader)
  File "/home/dgp/anaconda3/envs/owdetr/lib/python3.7/site-packages/torch/utils/data/dataloader.py", line 517, in __next__
    data = self._next_data()
  File "/home/dgp/anaconda3/envs/owdetr/lib/python3.7/site-packages/torch/utils/data/dataloader.py", line 557, in _next_data
    data = self._dataset_fetcher.fetch(index)  # may raise StopIteration
  File "/home/dgp/anaconda3/envs/owdetr/lib/python3.7/site-packages/torch/utils/data/_utils/fetch.py", line 44, in fetch
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/home/dgp/anaconda3/envs/owdetr/lib/python3.7/site-packages/torch/utils/data/_utils/fetch.py", line 44, in <listcomp>
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/home/dgp/code/OW-DETR/datasets/torchvision_datasets/open_world.py", line 245, in __getitem__
    img, target = self.transforms[-1](img, target)
  File "/home/dgp/code/OW-DETR/datasets/transforms.py", line 276, in __call__
    image, target = t(image, target)
  File "/home/dgp/code/OW-DETR/datasets/transforms.py", line 234, in __call__
    return self.transforms2(img, target)
  File "/home/dgp/code/OW-DETR/datasets/transforms.py", line 276, in __call__
    image, target = t(image, target)
  File "/home/dgp/code/OW-DETR/datasets/transforms.py", line 208, in __call__
    return resize(img, target, size, self.max_size)
  File "/home/dgp/code/OW-DETR/datasets/transforms.py", line 126, in resize
    scaled_boxes = boxes * torch.as_tensor([ratio_width, ratio_height, ratio_width, ratio_height])
RuntimeError: The size of tensor a (0) must match the size of tensor b (4) at non-singleton dimension 0
```

Honey-ashead commented 6 months ago

Yes, that's the case! An image may contain only instances that do not belong to any valid class for the current task; then all the instances (annotations) in that image are removed, which raises the errors above. Setting a bigger batch size can sometimes alleviate the problem, but it can still occur at any time. I think an inappropriate selection of the images used for training leads to this bug.

lzylzylzy123456 commented 6 months ago

> Yes, that's the case! An image may contain only instances that do not belong to any valid class for the current task; then all the instances (annotations) in that image are removed, which raises the errors above. Setting a bigger batch size can sometimes alleviate the problem, but it can still occur at any time. I think an inappropriate selection of the images used for training leads to this bug.

I previously replicated OWOD successfully. OW-DETR and OWOD use the same data: the entire VOC dataset (20 classes) plus MS-COCO (using only 60 of its classes). In OWOD, as I recall, the strategy was as follows: taking t1 as an example, t1 treats only the 20 VOC classes as known, and all other labeled categories are mapped to unknown during data preprocessing; as the tasks progress, the set of known categories is gradually expanded. I also noticed that OW-DETR's training scripts use the original OWOD split, so I don't think this error is related to the dataset itself.

I thought it might be an environment issue, so I installed both the environment recommended in the repository and a custom one, but the error persisted. I then tried to patch the code where the error occurred, but fixing one error only led to another; the errors seemed endless, so I suspect the authors did something unreasonable in the preprocessing stage. I haven't looked into the method you mentioned, but it sounds like the same issue.

Additionally, in OWOD the dataloader's NUM_WORKERS must be set to 1 when loading data; otherwise, data-reading errors may occur during training (and that problem can also arise at any time). The troublesome part is that OW-DETR is a comparison experiment I have to run. I saw in the issues that other people hit this error too, but the author has not responded.
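For reference, here is a minimal illustration of the two workarounds mentioned so far, a larger batch size and a single-worker loader (the dataset below is a stand-in, and neither setting is a verified fix):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Stand-in dataset; in the real pipeline this would be the OWDetection dataset.
dataset = TensorDataset(torch.randn(64, 3, 32, 32), torch.randint(0, 20, (64,)))

# num_workers=1 is the OWOD workaround described above; a larger batch_size was
# reported to make the empty-annotation crash less frequent, not impossible.
loader = DataLoader(dataset, batch_size=8, num_workers=1, shuffle=True)
```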

Honey-ashead commented 6 months ago

While training OW-DETR on t1, I observed that the annotations of some images are indeed removed entirely. Increasing the batch size can sometimes alleviate this issue, but not always. Given how hard these issues with the OW-DETR source code are to resolve, if you need OW-DETR results to complete your paper, you may consider directly using the results presented in the original paper. That helps keep your findings reliable and reproducible. If you do use the reported results, state this clearly in your methodology section, provide appropriate citations, and mention the issues you encountered with the OW-DETR source code to explain why you opted for the reported numbers.

daijiu9 commented 6 months ago

Have you successfully reproduced OW-DETR before?

daijiu9 commented 6 months ago

I want to run OW-DETR on my own dataset; can that be done?

lzylzylzy123456 commented 6 months ago

> Have you successfully reproduced OW-DETR before?

I successfully ran OWOD, but I was unable to run OW-DETR. As the issue author stated, remove_prev_class_and_unk_instances in open_world.py seems to trigger an incorrect deletion, removing all annotations and causing the error in transforms.py.

Honey-ashead commented 6 months ago

No, I have not been able to reproduce OW-DETR because of the issues above, which is driving me crazy. If you want to run your own dataset, you had better split it carefully across the different tasks and ensure that no image has all of its annotations removed. The authors seem to have ignored this edge case.
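As a concrete check, here is a hedged sketch (the file names and directory layout are assumptions based on a VOC/OWOD-style structure, not the repo's code) that scans a task's split file and flags images that would lose every annotation:

```python
import xml.etree.ElementTree as ET
from pathlib import Path

# Subset of known classes for illustration; use the task's real class tuple.
TASK_CLASSES = {"aeroplane", "bicycle", "bird", "boat", "bottle"}
VOC_ROOT = Path("datasets/VOC2007")  # assumed location

def has_valid_instance(image_id: str) -> bool:
    """True if at least one object in the VOC XML belongs to TASK_CLASSES."""
    tree = ET.parse(VOC_ROOT / "Annotations" / f"{image_id}.xml")
    return any(obj.findtext("name") in TASK_CLASSES for obj in tree.iter("object"))

split_file = VOC_ROOT / "ImageSets" / "Main" / "t1_train.txt"  # assumed name
bad = [iid for iid in split_file.read_text().split() if not has_valid_instance(iid)]
print(f"{len(bad)} images would lose all annotations; consider dropping them.")
```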

lzylzylzy123456 commented 6 months ago

[Two attached screenshots: the first shows the task split of OW-DETR, the second the task split of OWOD.] After replacing OW-DETR's task split with OWOD's, task 1 can run. I haven't tried the subsequent tasks 2, 3, and 4 yet, but when I added print statements I saw no label-deletion errors after the replacement, since the first task should not have had them in the first place.
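For clarity, the "task split" lives in the class-name tuples in datasets/torchvision_datasets/open_world.py, so replacing it means redefining those tuples. A hedged, abbreviated illustration (the tuple contents below are truncated placeholders, not the exact splits):

```python
# OWOD-style Task 1: all 20 VOC classes are known from the start (abbreviated).
VOC_CLASS_NAMES = (
    "aeroplane", "bicycle", "bird", "boat", "bottle",
    # ... the remaining 15 VOC classes ...
)

# Later tasks extend the known set with COCO categories (abbreviated).
T2_CLASS_NAMES = (
    "truck", "traffic light", "fire hydrant", "stop sign",
    # ... the remaining Task 2 classes ...
)
```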

Honey-ashead commented 6 months ago

Thanks for your tips; task 1 runs now!

Honey-ashead commented 6 months ago

> [Two attached screenshots: the first shows the task split of OW-DETR, the second the task split of OWOD.] After replacing OW-DETR's task split with OWOD's, task 1 can run. I haven't tried the subsequent tasks 2, 3, and 4 yet, but when I added print statements I saw no label-deletion errors after the replacement, since the first task should not have had them in the first place.

I replaced VOC_CLASS_NAMES with the one from OWOD, and t1 ran correctly at first; however, errors occurred during testing:

```
Traceback (most recent call last):
  File "main_open_world.py", line 371, in <module>
    main(args)
  File "main_open_world.py", line 309, in main
    model, criterion, postprocessors, data_loader_val, base_ds, device, args.output_dir, args
  File "/home/zkx/anaconda3/envs/owdetr/lib/python3.7/site-packages/torch/autograd/grad_mode.py", line 27, in decorate_context
    return func(*args, **kwargs)
  File "/home/zkx/Workspace/OW-DETR/engine.py", line 107, in evaluate
    for samples, targets in metric_logger.log_every(data_loader, 10, header):
  File "/home/zkx/Workspace/OW-DETR/util/misc.py", line 259, in log_every
    for obj in iterable:
  File "/home/zkx/anaconda3/envs/owdetr/lib/python3.7/site-packages/torch/utils/data/dataloader.py", line 517, in __next__
    data = self._next_data()
  File "/home/zkx/anaconda3/envs/owdetr/lib/python3.7/site-packages/torch/utils/data/dataloader.py", line 1199, in _next_data
    return self._process_data(data)
  File "/home/zkx/anaconda3/envs/owdetr/lib/python3.7/site-packages/torch/utils/data/dataloader.py", line 1225, in _process_data
    data.reraise()
  File "/home/zkx/anaconda3/envs/owdetr/lib/python3.7/site-packages/torch/_utils.py", line 429, in reraise
    raise self.exc_type(msg)
ValueError: Caught ValueError in DataLoader worker process 1.
Original Traceback (most recent call last):
  File "/home/zkx/anaconda3/envs/owdetr/lib/python3.7/site-packages/torch/utils/data/_utils/worker.py", line 202, in _worker_loop
    data = fetcher.fetch(index)
  File "/home/zkx/anaconda3/envs/owdetr/lib/python3.7/site-packages/torch/utils/data/_utils/fetch.py", line 44, in fetch
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/home/zkx/anaconda3/envs/owdetr/lib/python3.7/site-packages/torch/utils/data/_utils/fetch.py", line 44, in <listcomp>
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/home/zkx/Workspace/OW-DETR/datasets/torchvision_datasets/open_world.py", line 226, in __getitem__
    target, instances = self.load_instances(self.imgids[index])
  File "/home/zkx/Workspace/OW-DETR/datasets/torchvision_datasets/open_world.py", line 165, in load_instances
    category_id=VOC_COCO_CLASS_NAMES.index(cls),
ValueError: tuple.index(x): x not in tuple
```

Should I also modify T2-T4_CLASS_NAMES?
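The ValueError means an annotation's class string is missing from the edited VOC_COCO_CLASS_NAMES tuple, so the class tuples for all tasks need to stay mutually consistent. As a stopgap while debugging, here is a hedged sketch (my own helper, not the repo's code) that maps unlisted classes to the trailing 'unknown' slot instead of raising:

```python
VOC_COCO_CLASS_NAMES = ("aeroplane", "bicycle", "unknown")  # abbreviated for illustration

def safe_category_id(cls: str, class_names=VOC_COCO_CLASS_NAMES) -> int:
    """Like class_names.index(cls), but unlisted classes fall back to 'unknown'
    instead of raising ValueError: tuple.index(x): x not in tuple."""
    return class_names.index(cls if cls in class_names else "unknown")

print(safe_category_id("bicycle"))        # 1
print(safe_category_id("traffic light"))  # 2 (falls back to 'unknown')
```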

lzylzylzy123456 commented 6 months ago

> After replacing OW-DETR's task split with OWOD's, task 1 can run. [...]
>
> I replaced VOC_CLASS_NAMES with the one from OWOD, and t1 ran correctly at first; however, errors occurred during testing: *(traceback quoted above)*
>
> Should I also modify T2-T4_CLASS_NAMES?

I also encountered a testing error; the run crashed before even two epochs had been evaluated. After investigating, I found an issue with the source code: in engine.py, line 97, change coco_evaluator = OWEvaluator(base_ds, iou_types) to coco_evaluator = OWEvaluator(base_ds, iou_types, args=args); the args argument is necessary! You also need to replace all occurrences of np.bool in the code with bool. And if you want to replicate the OWOD data split in OW-DETR, you should modify all the tasks; I did, because I need to compare the results with OWOD.
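A minimal illustration of the two fixes just described (the line number and the OWEvaluator signature are taken from the comment above, not verified against the current code):

```python
import numpy as np

# 1) engine.py, around line 97: pass args through to the evaluator.
# coco_evaluator = OWEvaluator(base_ds, iou_types)             # before
# coco_evaluator = OWEvaluator(base_ds, iou_types, args=args)  # after

# 2) np.bool was removed in NumPy 1.24; use the builtin bool instead.
mask = np.zeros(8, dtype=bool)  # was: dtype=np.bool
```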

lzylzylzy123456 commented 6 months ago

> I replaced VOC_CLASS_NAMES with the one from OWOD, and t1 ran correctly at first; however, errors occurred during testing: *(traceback quoted above)*
>
> Should I also modify T2-T4_CLASS_NAMES?

If you have modified all the tasks according to the OWOD split, I suggest first shrinking the training and testing datasets, for example to 40 training images and 40 test images per task, running the experiment end to end, checking for errors, and then analyzing from there; see the sketch below.
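A hedged sketch of that shrinking step (the split-file names and directory layout are assumptions based on an OWOD-style ImageSets folder):

```python
from pathlib import Path

N = 40  # images to keep per split, as suggested above
splits_dir = Path("datasets/VOC2007/ImageSets/Main")  # assumed location

for split in sorted(splits_dir.glob("t?_train.txt")):  # assumed file names
    ids = split.read_text().split()[:N]
    debug_file = split.with_name(f"{split.stem}_debug{split.suffix}")
    debug_file.write_text("\n".join(ids) + "\n")
    print(f"{split.name}: kept {len(ids)} ids -> {debug_file.name}")
```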

lzylzylzy123456 commented 6 months ago

By "segmentation" here I mean the data split.

Honey-ashead commented 6 months ago

Thanks for your advice! I will follow your suggestions and post my feedback here.

lzylzylzy123456 commented 6 months ago

> In OW-DETR, the authors state that they used the replay method to keep the model from catastrophically forgetting what it has learned, but I cannot find the corresponding code in the release. Do you know how they implemented it?

The original paper states: 'For finetuning during incremental step, the learning rate is reduced by a factor of 10 and trained using a set of 50 stored exemplars per known class.' I also couldn't find the related code. I believe the paper simply used tx_ft for fine-tuning based on sample replay; I didn't see any related code in OWOD either.
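For context, here is a hedged sketch of the exemplar-replay idea in that quote (my own illustration under the stated '50 stored exemplars per known class' assumption, not code from either repo):

```python
import random
from collections import defaultdict

K = 50  # stored exemplars per known class, per the quoted paper text

def select_exemplars(samples, k=K):
    """samples: iterable of (image_id, class_name) pairs. Returns image ids
    keeping at most k exemplars per class, e.g. to build a tx_ft
    fine-tuning split."""
    per_class = defaultdict(list)
    for image_id, cls in samples:
        per_class[cls].append(image_id)
    keep = set()
    for ids in per_class.values():
        keep.update(random.sample(ids, min(k, len(ids))))
    return keep

# Toy usage: three annotations over two classes, keep at most 2 per class.
demo = [("img1", "cat"), ("img2", "cat"), ("img3", "dog")]
print(sorted(select_exemplars(demo, k=2)))
```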

lzylzylzy123456 commented 6 months ago

Using the OWOD split (owod, t1_train for training, all_task_test for testing), the dataset printouts are:

```
Dataset OWDetection
    Number of datapoints: 16551
    Root location: /home/dgp/code/OWOD/datasets/VOC2007
    [['train'], Compose(
        <datasets.transforms.RandomHorizontalFlip object at 0x7f329c1b4ad0>
        <datasets.transforms.RandomSelect object at 0x7f3286e91690>
        Compose(
            <datasets.transforms.ToTensor object at 0x7f329065aed0>
            <datasets.transforms.Normalize object at 0x7f328fd887d0>
        )
    )]

Dataset OWDetection
    Number of datapoints: 10246
    Root location: /home/dgp/code/OWOD/datasets/VOC2007
    [['test'], Compose(
        <datasets.transforms.RandomResize object at 0x7f3286e919d0>
        Compose(
            <datasets.transforms.ToTensor object at 0x7f3286e91bd0>
            <datasets.transforms.Normalize object at 0x7f3286e91b90>
        )
    )]
```

Testing data details (condensed; the repetitive progress lines and the known-classes tuple printed after every class are elided):

```
testing data details: 81 classes total (80 + 'unknown')
known classes (20): ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus',
    'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike',
    'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor')
full class tuple: the 20 VOC classes above, followed by the 60 COCO classes
    ('truck', 'traffic light', ..., 'bowl') and 'unknown'

Test: [   0/5123]  eta: 0:30:16  time: 0.3545  data: 0.0432  max mem: 1049
...                (progress lines elided)
Test: [5122/5123]  eta: 0:00:00  time: 0.1219  data: 0.0304  max mem: 1718
Test: Total time: 0:09:34 (0.1121 s / it)

Predictions per class (self.known_classes is the same 20-class VOC tuple
throughout): aeroplane 21211, bicycle 392, bird 74, boat 0, bottle 10,
bus 271, car 27257, cat 18289, chair 10353, cow 0, diningtable 9622,
dog 13194, horse 55, motorbike 8782, person 261116, pottedplant 2485,
sheep 0, sofa 11180, train 11493, tvmonitor 0, truck 0, traffic light 0,
fire hydrant 0, stop sign 0, parking meter 0, bench 0, elephant 0, bear 0,
zebra 0, giraffe 0, backpack 0, umbrella 0, handbag 0, tie 0
(log truncated here)
```
self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') suitcase has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') microwave has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') oven has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') toaster has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') sink has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') refrigerator has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') frisbee has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') skis has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') snowboard has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') sports ball has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') kite has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') baseball bat has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') baseball glove has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') skateboard has 0 predictions. 
self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') surfboard has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') tennis racket has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') banana has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') apple has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') sandwich has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') orange has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') broccoli has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') carrot has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') hot dog has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') pizza has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') donut has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') cake has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') bed has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') toilet has 0 predictions. 
self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') laptop has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') mouse has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') remote has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') keyboard has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') cell phone has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') book has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') clock has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') vase has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') scissors has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') teddy bear has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') hair drier has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') toothbrush has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') wine glass has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') cup has 0 predictions. 
self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') fork has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') knife has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') spoon has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') bowl has 0 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') unknown has 628816 predictions. self.known_classes = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car', 'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike', 'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor') detection mAP50: 0.139154 detection mAP: 0.139154 ---AP50--- Wilderness Impact: {0.1: {50: 0.04175074215324804}, 0.2: {50: 0.0333615213671852}, 0.3: {50: 0.031726434946267534}, 0.4: {50: 0.032623117946300954}, 0.5: {50: 0.032623117946300954}, 0.6: {50: 0.032623117946300954}, 0.7: {50: 0.032623117946300954}, 0.8: {50: 0.032623117946300954}, 0.9: {50: 0.032623117946300954}} avg_precision: {0.1: {50: 0.00240149341379689}, 0.2: {50: 0.00240149341379689}, 0.3: {50: 0.00240149341379689}, 0.4: {50: 0.00240149341379689}, 0.5: {50: 0.00240149341379689}, 0.6: {50: 0.00240149341379689}, 0.7: {50: 0.00240149341379689}, 0.8: {50: 0.00240149341379689}, 0.9: {50: 0.00240149341379689}} Absolute OSE (total_num_unk_det_as_known): {50: 13054.0} total_num_unk 23320 AP50: ['0.6', '0.0', '0.3', '0.0', '2.3', '0.5', '0.7', '1.2', '0.1', '0.0', '0.9', '0.5', '0.2', '0.2', '2.0', '0.3', '0.0', '0.2', '1.2', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0'] Precisions50: ['0.6', '0.0', '1.4', '0.0', '10.0', '1.5', '1.2', '1.0', '0.5', '0.0', '1.4', '1.0', '1.8', '0.8', '1.5', '0.8', '0.0', '0.8', '1.1', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.2'] Recall50: ['36.1', '0.0', '0.1', '0.0', '0.0', '1.0', '10.5', '35.1', '1.6', '0.0', '9.8', '19.7', '0.2', '13.3', '22.8', '2.0', '0.0', '17.6', '33.4', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', 
'0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '0.0', '6.5'] Current class AP50: tensor(0.5625) Current class Precisions50: 1.2671957440714505 Current class Recall50: 10.170403362244912 Known AP50: tensor(0.5625) Known Precisions50: 1.2671957440714505 Known Recall50: 10.170403362244912 Unknown AP50: tensor(0.0218) Unknown Precisions50: 0.23997480980127733 Unknown Recall50: 6.470840480274442 aeroplane 0.637674 bicycle 0.000000 bird 0.324675 boat 0.000000 bottle 2.272727 bus 0.505050 car 0.706116 cat 1.228560 chair 0.126850 cow 0.000000 diningtable 0.909091 dog 0.467677 horse 0.245700 motorbike 0.243274 person 1.958900 pottedplant 0.284091 sheep 0.000000 sofa 0.186913 train 1.152365 tvmonitor 0.000000 truck 0.000000 traffic light 0.000000 fire hydrant 0.000000 stop sign 0.000000 parking meter 0.000000 bench 0.000000 elephant 0.000000 bear 0.000000 zebra 0.000000 giraffe 0.000000 backpack 0.000000 umbrella 0.000000 handbag 0.000000 tie 0.000000 suitcase 0.000000 microwave 0.000000 oven 0.000000 toaster 0.000000 sink 0.000000 refrigerator 0.000000 frisbee 0.000000 skis 0.000000 snowboard 0.000000 sports ball 0.000000 kite 0.000000 baseball bat 0.000000 baseball glove 0.000000 skateboard 0.000000 surfboard 0.000000 tennis racket 0.000000 banana 0.000000 apple 0.000000 sandwich 0.000000 orange 0.000000 broccoli 0.000000 carrot 0.000000 hot dog 0.000000 pizza 0.000000 donut 0.000000 cake 0.000000 bed 0.000000 toilet 0.000000 laptop 0.000000 mouse 0.000000 remote 0.000000 keyboard 0.000000 cell phone 0.000000 book 0.000000 clock 0.000000 vase 0.000000 scissors 0.000000 teddy bear 0.000000 hair drier 0.000000 toothbrush 0.000000 wine glass 0.000000 cup 0.000000 fork 0.000000 knife 0.000000 spoon 0.000000 bowl 0.000000 unknown 0.021832

I have trained the first task and used the original OWOD test set for evaluation. I cannot understand why some categories have an AP exceeding 1; there seems to be an issue with the eval code here. Although the WI has decreased, the AOSE is very high, and the mAP of the known classes also looks abnormal. What could be causing this?

lzylzylzy123456 commented 6 months ago

In OW-DETR, the authors state that they use a replay method to prevent the model from catastrophically forgetting what it has learnt, but I cannot find the corresponding code in the released repository. Do you know how they implemented it?

The original paper states, 'For finetuning during incremental step, the learning rate is reduced by a factor of 10 and trained using a set of 50 stored exemplars per known class.' I also couldn't find the related code. I believe the paper simply fine-tuned on the tx_ft split for sample replay, as I didn't see any related code in OWOD either.

(screenshot attachment: issue5)

It seems to be consistent with my guess
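
For reference, a minimal sketch of the exemplar-replay fine-tuning the paper describes. The helper below is hypothetical (build_replay_set does not exist in the repository); it only illustrates the '50 stored exemplars per known class, learning rate reduced by a factor of 10' recipe:

import random
from collections import defaultdict

def build_replay_set(dataset, known_class_ids, per_class=50, seed=0):
    """Hypothetical helper: pick up to per_class exemplar images per known class."""
    rng = random.Random(seed)
    buckets = defaultdict(list)
    for idx in range(len(dataset)):
        _, target = dataset[idx]  # assumes targets expose a "labels" tensor
        for label in set(target["labels"].tolist()):
            if label in known_class_ids:
                buckets[label].append(idx)
    exemplar_ids = set()
    for ids in buckets.values():
        exemplar_ids.update(rng.sample(ids, min(per_class, len(ids))))
    return sorted(exemplar_ids)

# Fine-tuning would then run on torch.utils.data.Subset(dataset, exemplar_ids)
# with the learning rate reduced by 10x:
# for group in optimizer.param_groups:
#     group["lr"] *= 0.1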

djhandsome commented 6 months ago

ValueError: Expected value argument (Tensor of shape (100,)) to be within the support (GreaterThan(lower_bound=0.0)) of the distribution Weibull(scale: 1.2541329860687256, concentration: 1.6065914630889893), but found invalid values: tensor([-0.4707, -0.4606, -0.4505, -0.4404, -0.4303, -0.4202, -0.4101, -0.4000, -0.3899, -0.3798, -0.3697, -0.3596, -0.3495, -0.3394, -0.3293, -0.3192, -0.3091, -0.2990, -0.2889, -0.2788, -0.2687, -0.2586, -0.2485, -0.2384, -0.2283, -0.2182, -0.2081, -0.1980, -0.1879, -0.1778, -0.1677, -0.1576, -0.1475, -0.1374, -0.1273, -0.1172, -0.1071, -0.0970, -0.0869, -0.0768, -0.0667, -0.0566, -0.0465, -0.0364, -0.0263, -0.0162, -0.0061, 0.0040, 0.0141, 0.0242, 0.0343, 0.0444, 0.0545, 0.0646, 0.0747, 0.0848, 0.0949, 0.1050, 0.1151, 0.1252, 0.1353, 0.1454, 0.1555, 0.1656, 0.1757, 0.1858, 0.1959, 0.2060, 0.2161, 0.2262, 0.2363, 0.2464, 0.2565, 0.2666, 0.2767, 0.2868, 0.2969, 0.3070, 0.3172, 0.3273, 0.3374, 0.3475, 0.3576, 0.3677, 0.3778, 0.3879, 0.3980, 0.4081, 0.4182, 0.4283, 0.4384, 0.4485, 0.4586, 0.4687, 0.4788, 0.4889, 0.4990, 0.5091, 0.5192, 0.5293])

djhandsome commented 6 months ago

Thank you for your work. I saw that you successfully reproduced owod, and I encountered a problem during the process of reproducing owod Task 1. Have you also encountered it?

Honey-ashead commented 6 months ago

Thank you for your work. I saw that you successfully reproduced owod, and I encountered a problem during the process of reproducing owod Task 1. Have you also encountered it?

It seems that there is something wrong with the calculation of lse: lse = temp * torch.logsumexp(logits[:, :num_seen_classes] / temp, dim=1). The lse values should all be greater than 0.
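
To see why the Weibull check trips, note that torch.logsumexp is not sign-constrained. A minimal synthetic sketch (the logits are made up, and the clamp at the end is only one possible workaround, not the repository's fix):

import torch

temp, num_seen_classes = 1.0, 20
logits = torch.randn(100, 91) - 5.0  # synthetic logits, mostly negative

# Same expression as in the repository; nothing forces the result to be positive.
lse = temp * torch.logsumexp(logits[:, :num_seen_classes] / temp, dim=1)
print(lse.min())  # typically < 0 here, i.e. outside the Weibull support

# One possible workaround: clamp to a small positive value before log_prob.
weibull = torch.distributions.Weibull(scale=1.25, concentration=1.61)
print(weibull.log_prob(lse.clamp(min=1e-6)).shape)  # now within support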

djhandsome commented 6 months ago

Thank you for your reply. I have resolved the issue with OWOD, but while reproducing the OW-DETR code I hit an issue running copy2voc.py: FileNotFoundError: [Errno 2] No such file or directory: 'data/VOC2007/ImageSets/Main/train.txt'
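
If the split file is simply missing, one stop-gap is to generate it from the images on disk. This is a hedged sketch only: the path comes from the error message above, and whether train.txt should list every image depends on the split you intend to train on.

from pathlib import Path

root = Path("data/VOC2007")
main_dir = root / "ImageSets" / "Main"
main_dir.mkdir(parents=True, exist_ok=True)

# List every image stem as a training id; restrict this if only a subset belongs to train.
stems = sorted(p.stem for p in (root / "JPEGImages").glob("*.jpg"))
(main_dir / "train.txt").write_text("\n".join(stems) + "\n")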

lzylzylzy123456 commented 6 months ago

Thank you for your work. I saw that you successfully reproduced owod, and I encountered a problem during the process of reproducing owod Task 1. Have you also encountered it?

May I ask how you solved it?

djhandsome commented 6 months ago

I think it's a problem with the PyTorch version because I can run it normally after lowering the version.

lzylzylzy123456 commented 6 months ago

I think it's a problem with the PyTorch version because I can run it normally after lowering the version.

May I ask which version of PyTorch you are using? I do not hit the Weibull-distribution issue when using torch 2.0.1 on OWOD, but the same torch 2.0.1 does hit it on OW-DETR. However, the code that computes the Weibull distribution is actually the same in both repositories, which is strange.

Honey-ashead commented 6 months ago

The version of PyTorch I am using is also 2.0.1, but I cannot reproduce OW-DETR either.

Honey-ashead commented 6 months ago

It seems that I met the same issue as mentioned above: in the file datasets/torchvision_datasets/open_world.py, the OWDetection method remove_prev_class_and_unk_instances has a potential problem. An image may contain only instances that belong to no valid class for the current task; all of its instances are then removed and the method returns no annotations, which raises a series of errors in the following code since the bboxes of the processed image have shape torch.Size([0]).

Following @lzylzylzy123456, I have successfully reproduced OWOD, but I still have difficulty reproducing OW-DETR due to the above issues.
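
One general way to sidestep the empty-annotation failure is to filter such images out once, before training starts. A sketch under the assumption that the dataset can enumerate image ids and load their annotations; the function and attribute names are illustrative, not the actual OWDetection internals:

def filter_empty_images(image_ids, load_annotations, remove_prev_class_and_unk_instances):
    """Keep only images that still have at least one valid instance for the task."""
    kept = []
    for img_id in image_ids:
        instances = remove_prev_class_and_unk_instances(load_annotations(img_id))
        if len(instances) > 0:
            kept.append(img_id)
    return kept

# e.g. inside the dataset's __init__ (illustrative):
# self.image_set = filter_empty_images(self.image_set, self._load_anno,
#                                      self.remove_prev_class_and_unk_instances)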

daijiu9 commented 6 months ago

Sorry, I only just saw the reply. I am using PyTorch 1.7 and am unable to successfully reproduce OW-DETR. May I ask whether you ran run.sh directly when reproducing it? Also, I can't find the file 'configs/OWOD_New_Split.sh'.

lobbyd commented 6 months ago

Traceback (most recent call last):
  File "/code/OW-DETR/main_open_world.py", line 365, in <module>
    main(args)
  File "/code/OW-DETR/main_open_world.py", line 288, in main
    train_stats = train_one_epoch(
  File "/code/OW-DETR/engine.py", line 41, in train_one_epoch
    samples, targets = prefetcher.next()
  File "/code/OW-DETR/datasets/data_prefetcher.py", line 62, in next
    self.preload()
  File "/code/OW-DETR/datasets/data_prefetcher.py", line 25, in preload
    self.next_samples, self.next_targets = next(self.loader)
IndexError: Caught IndexError in DataLoader worker process 1.
Original Traceback (most recent call last):
  File "/opt/conda/lib/python3.9/site-packages/torch/utils/data/_utils/worker.py", line 302, in _worker_loop
    data = fetcher.fetch(index)
  File "/opt/conda/lib/python3.9/site-packages/torch/utils/data/_utils/fetch.py", line 58, in fetch
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/opt/conda/lib/python3.9/site-packages/torch/utils/data/_utils/fetch.py", line 58, in <listcomp>
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/code/OW-DETR/datasets/torchvision_datasets/open_world.py", line 239, in __getitem__
    img, target = self.transforms[-1](img, target)
  File "/code/OW-DETR/datasets/transforms.py", line 276, in __call__
    image, target = t(image, target)
  File "/code/OW-DETR/datasets/transforms.py", line 196, in __call__
    return hflip(img, target)
  File "/code/OW-DETR/datasets/transforms.py", line 76, in hflip
    boxes = boxes[:, [2, 1, 0, 3]] * torch.as_tensor([-1, 1, -1, 1]) + torch.as_tensor([w, 0, w, 0])
IndexError: too many indices for tensor of dimension 1

When I run the training procedure I hit this bug, and I don't know how to fix it. I want to know whether it's a dataset problem or a bug in the code.

Honey-ashead commented 6 months ago

I guess it's a problem with the dataset partitioning, which may result in all the annotations of an image being removed.

CcccccaptionOvO commented 5 months ago

I want to run my own dataset with OW-DETR; is that achievable? Have you managed to run your own dataset with OW-DETR?

yangengineering commented 2 months ago

(quoting @lobbyd's IndexError traceback and question above)

I also met this issue. Did you solve it?

ss880426 commented 2 months ago

(quoting @lobbyd's IndexError traceback and question above)

I also met this issue. Did you solve it?

Change if "boxes" in target to if "boxes" in target and target["boxes"].shape[0] != 0: so that images whose annotations have all been removed skip the box transform; see the sketch below.
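
Applied to the standard DETR-style hflip that the tracebacks above point at, the guard would look roughly like this (a sketch; the exact surrounding code in datasets/transforms.py may differ):

import torch
import torchvision.transforms.functional as F

def hflip(img, target):
    flipped_img = F.hflip(img)
    w, h = img.size  # PIL image
    target = target.copy()
    # Guard: an image whose annotations were all filtered out upstream carries an
    # empty 1-D boxes tensor, and the [:, ...] indexing below fails on it.
    if "boxes" in target and target["boxes"].shape[0] != 0:
        boxes = target["boxes"]
        boxes = boxes[:, [2, 1, 0, 3]] * torch.as_tensor([-1, 1, -1, 1]) + torch.as_tensor([w, 0, w, 0])
        target["boxes"] = boxes
    return flipped_img, target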

yangengineering commented 2 months ago

Thank you, I changed the code as you suggested, but there is still a problem:

Traceback (most recent call last):
  File "/home/openworld/OW-DETR/main_open_world.py", line 358, in <module>
    main(args)
  File "/home/openworld/OW-DETR/main_open_world.py", line 281, in main
    model, criterion, data_loader_train, optimizer, device, epoch, args.nc_epoch, args.clip_max_norm)
  File "/home/openworld/OW-DETR/engine.py", line 44, in train_one_epoch
    loss_dict = criterion(samples, outputs, targets, epoch)  ## samples variable needed for feature selection
  File "/opt/conda/envs/owdetr/lib/python3.7/site-packages/torch/nn/modules/module.py", line 889, in _call_impl
    result = self.forward(*input, **kwargs)
  File "/home/openworld/OW-DETR/models/deformable_detr.py", line 465, in forward
    indices = self.matcher(outputs_without_aux, targets)
  File "/opt/conda/envs/owdetr/lib/python3.7/site-packages/torch/nn/modules/module.py", line 889, in _call_impl
    result = self.forward(*input, **kwargs)
  File "/home/openworld/OW-DETR/models/matcher.py", line 84, in forward
    cost_bbox = torch.cdist(out_bbox, tgt_bbox, p=1)
  File "/opt/conda/envs/owdetr/lib/python3.7/site-packages/torch/functional.py", line 1119, in cdist
    return _VF.cdist(x1, x2, p, None)  # type: ignore
RuntimeError: cdist only supports at least 2D tensors, X2 got: 1D

Did you meet this problem? @ss880426

ss880426 commented 2 months ago

(quoting @yangengineering's cdist RuntimeError traceback above)

This means that some image in your data ends up with no bbox, because the bboxes of certain classes are deleted during training. You can try increasing batch_size so a batch is less likely to contain only such images.
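
The shape mismatch is easy to reproduce in isolation. A synthetic sketch of the failure mode (tensor shapes are illustrative):

import torch

out_bbox = torch.rand(300, 4)  # predicted boxes: [num_queries, 4]

# Degenerate targets: an empty 1-D tensor (shape [0]) instead of shape [0, 4].
try:
    torch.cdist(out_bbox, torch.empty(0), p=1)
except RuntimeError as err:
    print(err)  # cdist only supports at least 2D tensors ...

# An empty but correctly shaped 2-D tensor is accepted and yields a [300, 0] cost.
print(torch.cdist(out_bbox, torch.empty(0, 4), p=1).shape)  # torch.Size([300, 0])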

jeantirole commented 2 months ago

I encountered the same issue, but raising the batch size is limited by GPU RAM; I can only use a batch_size smaller than 4, and I still hit the same problem during training.

yangengineering commented 1 month ago

I increased batch_size to 4 and this problem was solved, but when I run Task 1 the GPU RAM still keeps increasing until it exceeds the maximum. At the same time, I found that the multi-process code can't run at all, which is frustrating.
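
On the steadily growing GPU memory: a common general cause (not confirmed for this repository) is appending tensors that are still attached to the autograd graph, for example into a feature or exemplar store; every stored tensor then keeps its whole computation graph alive. Detaching and moving the tensor off the GPU before storing usually fixes it:

import torch

feature_store = []

def store_feature(feat: torch.Tensor) -> None:
    # Detach from the graph and move to CPU so the old forward graph
    # (and its GPU buffers) can actually be freed.
    feature_store.append(feat.detach().cpu())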

zengqing1218 commented 1 month ago

Have you encountered the following problem when training with personal datasets?

Traceback (most recent call last):
  File "main_open_world.py", line 365, in <module>
    main(args)
  File "main_open_world.py", line 288, in main
    train_stats = train_one_epoch(
  File "/media/ll/02CA5FA6CA5F952F/WZQ_new/OW-DETR-main/engine.py", line 41, in train_one_epoch
    samples, targets = prefetcher.next()
  File "/media/ll/02CA5FA6CA5F952F/WZQ_new/OW-DETR-main/datasets/data_prefetcher.py", line 62, in next
    self.preload()
  File "/media/ll/02CA5FA6CA5F952F/WZQ_new/OW-DETR-main/datasets/data_prefetcher.py", line 25, in preload
    self.next_samples, self.next_targets = next(self.loader)
  File "/home/ll/anaconda3/envs/owdetr/lib/python3.8/site-packages/torch/utils/data/dataloader.py", line 681, in __next__
    data = self._next_data()
  File "/home/ll/anaconda3/envs/owdetr/lib/python3.8/site-packages/torch/utils/data/dataloader.py", line 1356, in _next_data
    return self._process_data(data)
  File "/home/ll/anaconda3/envs/owdetr/lib/python3.8/site-packages/torch/utils/data/dataloader.py", line 1402, in _process_data
    data.reraise()
  File "/home/ll/anaconda3/envs/owdetr/lib/python3.8/site-packages/torch/_utils.py", line 461, in reraise
    raise exception
IndexError: Caught IndexError in DataLoader worker process 1.
Original Traceback (most recent call last):
  File "/home/ll/anaconda3/envs/owdetr/lib/python3.8/site-packages/torch/utils/data/_utils/worker.py", line 302, in _worker_loop
    data = fetcher.fetch(index)
  File "/home/ll/anaconda3/envs/owdetr/lib/python3.8/site-packages/torch/utils/data/_utils/fetch.py", line 49, in fetch
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/home/ll/anaconda3/envs/owdetr/lib/python3.8/site-packages/torch/utils/data/_utils/fetch.py", line 49, in <listcomp>
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/media/ll/02CA5FA6CA5F952F/WZQ_new/OW-DETR-main/datasets/torchvision_datasets/open_world.py", line 405, in __getitem__
    img, target = self.transforms[-1](img, target)
  File "/media/ll/02CA5FA6CA5F952F/WZQ_new/OW-DETR-main/datasets/transforms.py", line 402, in __call__
    image, target = t(image, target)
  File "/media/ll/02CA5FA6CA5F952F/WZQ_new/OW-DETR-main/datasets/transforms.py", line 322, in __call__
    return hflip(img, target)
  File "/media/ll/02CA5FA6CA5F952F/WZQ_new/OW-DETR-main/datasets/transforms.py", line 142, in hflip
    boxes = boxes[:, [2, 1, 0, 3]] * torch.as_tensor([-1, 1, -1, 1]) + torch.as_tensor([w, 0, w, 0])
IndexError: too many indices for tensor of dimension 1

yangengineering commented 3 weeks ago

You can increase the batch size to more than 2; 4 works better.