felipecode / coiltraine

Training framework for conditional imitation learning
MIT License

Error during the training after loading data (AttributeError: 'NoneType' object has no attribute 'swapaxes') #12

Closed fenjiro closed 5 years ago

fenjiro commented 5 years ago

Hello, when I launched the training, after the script loaded the dataset I got the following error: AttributeError: 'NoneType' object has no attribute 'swapaxes' (see the log below). Could you give me some pointers?

python coiltraine.py --single-process train -e resnet34imnet --folder baselines --gpus 0

pygame 1.9.4
Hello from the pygame community. https://www.pygame.org/contribute.html
self.root_dir /media/ucef/90E6A5B7E6A59DCA/COiLTRAiNESampleDatasets/CoILTrain/L0
preload Name 50hours_L0
Episode /media/ucef/90E6A5B7E6A59DCA/COiLTRAiNESampleDatasets/CoILTrain/L0/episode_00000 Loaded 0.19058333333333333 hours of data
Episode /media/ucef/90E6A5B7E6A59DCA/COiLTRAiNESampleDatasets/CoILTrain/L0/episode_00001 Loaded 0.34291666666666665 hours of data
Episode /media/ucef/90E6A5B7E6A59DCA/COiLTRAiNESampleDatasets/CoILTrain/L0/episode_00002 Loaded 0.55775 hours of data
Episode /media/ucef/90E6A5B7E6A59DCA/COiLTRAiNESampleDatasets/CoILTrain/L0/episode_00003 Loaded 0.7453333333333333 hours of data
Episode /media/ucef/90E6A5B7E6A59DCA/COiLTRAiNESampleDatasets/CoILTrain/L0/episode_00004 Loaded 0.8699166666666667 hours of data
Episode /media/ucef/90E6A5B7E6A59DCA/COiLTRAiNESampleDatasets/CoILTrain/L0/episode_00005 Loaded 1.0498333333333334 hours of data
Episode /media/ucef/90E6A5B7E6A59DCA/COiLTRAiNESampleDatasets/CoILTrain/L0/episode_00006 Loaded 1.2786666666666666 hours of data
Episode /media/ucef/90E6A5B7E6A59DCA/COiLTRAiNESampleDatasets/CoILTrain/L0/episode_00007 Loaded 1.4104999999999999 hours of data
Episode /media/ucef/90E6A5B7E6A59DCA/COiLTRAiNESampleDatasets/CoILTrain/L0/episode_00008 Loaded 1.538333333333333 hours of data
Episode /media/ucef/90E6A5B7E6A59DCA/COiLTRAiNESampleDatasets/CoILTrain/L0/episode_00010 Loaded 1.725833333333333 hours of data
Episode /media/ucef/90E6A5B7E6A59DCA/COiLTRAiNESampleDatasets/CoILTrain/L0/episode_00011 Loaded 1.7706666666666664 hours of data
Episode /media/ucef/90E6A5B7E6A59DCA/COiLTRAiNESampleDatasets/CoILTrain/L0/episode_00012 Loaded 1.9157499999999996 hours of data
Episode /media/ucef/90E6A5B7E6A59DCA/COiLTRAiNESampleDatasets/CoILTrain/L0/episode_00014 Loaded 2.1914999999999996 hours of data
Episode /media/ucef/90E6A5B7E6A59DCA/COiLTRAiNESampleDatasets/CoILTrain/L0/episode_00015 Loaded 2.221083333333333 hours of data
Episode /media/ucef/90E6A5B7E6A59DCA/COiLTRAiNESampleDatasets/CoILTrain/L0/episode_00016 Loaded 2.366833333333333 hours of data
Episode /media/ucef/90E6A5B7E6A59DCA/COiLTRAiNESampleDatasets/CoILTrain/L0/episode_00017 Loaded 2.4814999999999996 hours of data
Episode /media/ucef/90E6A5B7E6A59DCA/COiLTRAiNESampleDatasets/CoILTrain/L0/episode_00018 Loaded 2.6587499999999995 hours of data
Episode /media/ucef/90E6A5B7E6A59DCA/COiLTRAiNESampleDatasets/CoILTrain/L0/episode_00020 Loaded 2.859083333333333 hours of data
Episode /media/ucef/90E6A5B7E6A59DCA/COiLTRAiNESampleDatasets/CoILTrain/L0/episode_00021 Loaded 3.11325 hours of data
Episode /media/ucef/90E6A5B7E6A59DCA/COiLTRAiNESampleDatasets/CoILTrain/L0/episode_00022 Loaded 3.2121666666666666 hours of data
preload Name 50hours_L0
Loaded dataset
Before the loss
Traceback (most recent call last):
  File "/media/ucef/Nouveau nom/CARLA/coiltraine/coil_core/train.py", line 116, in execute
    for data in data_loader:
  File "/home/ucef/anaconda3/envs/coiltraine/lib/python3.5/site-packages/torch/utils/data/dataloader.py", line 336, in __next__
    return self._process_next_batch(batch)
  File "/home/ucef/anaconda3/envs/coiltraine/lib/python3.5/site-packages/torch/utils/data/dataloader.py", line 357, in _process_next_batch
    raise batch.exc_type(batch.exc_msg)
numpy.core._internal.AxisError: Traceback (most recent call last):
  File "/home/ucef/anaconda3/envs/coiltraine/lib/python3.5/site-packages/numpy/core/fromnumeric.py", line 51, in _wrapfunc
    return getattr(obj, method)(*args, **kwds)
AttributeError: 'NoneType' object has no attribute 'swapaxes'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/ucef/anaconda3/envs/coiltraine/lib/python3.5/site-packages/torch/utils/data/dataloader.py", line 106, in _worker_loop
    samples = collate_fn([dataset[i] for i in batch_indices])
  File "/home/ucef/anaconda3/envs/coiltraine/lib/python3.5/site-packages/torch/utils/data/dataloader.py", line 106, in <listcomp>
    samples = collate_fn([dataset[i] for i in batch_indices])
  File "/media/ucef/Nouveau nom/CARLA/coiltraine/input/coil_dataset.py", line 112, in __getitem__
    img = self.transform(self.batch_read_number * boost, img)
  File "/media/ucef/Nouveau nom/CARLA/coiltraine/input/augmenter.py", line 31, in __call__
    img = np.swapaxes(img, 0, 2)
  File "/home/ucef/anaconda3/envs/coiltraine/lib/python3.5/site-packages/numpy/core/fromnumeric.py", line 549, in swapaxes
    return _wrapfunc(a, 'swapaxes', axis1, axis2)
  File "/home/ucef/anaconda3/envs/coiltraine/lib/python3.5/site-packages/numpy/core/fromnumeric.py", line 61, in _wrapfunc
    return _wrapit(obj, method, *args, **kwds)
  File "/home/ucef/anaconda3/envs/coiltraine/lib/python3.5/site-packages/numpy/core/fromnumeric.py", line 41, in _wrapit
    result = getattr(asarray(obj), method)(*args, **kwds)
numpy.core._internal.AxisError: axis1: axis 0 is out of bounds for array of dimension 0
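
For what it's worth, the same AttributeError / AxisError pair can be reproduced in isolation when an image fails to load, which suggests the augmenter is being handed None instead of an image array. A minimal reproduction under that assumption (using OpenCV the way the dataset presumably reads frames):

# Minimal reproduction (assumption: a dataset image fails to load, so the
# augmenter receives None instead of an HxWxC array).
import cv2
import numpy as np

img = cv2.imread("does_not_exist.png")
print(img)              # None -- OpenCV signals a failed read by returning None
np.swapaxes(img, 0, 2)  # raises AxisError: axis 0 is out of bounds for array of dimension 0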

felipecode commented 5 years ago

Try deleting the _preload files. We cache the position of the images and maybe that was wrong. Also make sure the COIL_DATASET_PATH is correct.
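
In case it helps, a rough sketch of that cleanup (the _preloads directory name and the .npy extension are assumptions about where the cached index lives; adjust them to your checkout):

# Sketch: print the dataset path and remove cached preload files so they are rebuilt.
# "_preloads" and "*.npy" are assumptions about the cache location, not fixed names.
import glob
import os

print("COIL_DATASET_PATH =", os.environ.get("COIL_DATASET_PATH"))
for cached in glob.glob(os.path.join("_preloads", "*.npy")):
    print("removing cached preload:", cached)
    os.remove(cached)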

fenjiro commented 5 years ago

Hello,

I've deleted the _preload files as you suggested, checked the COIL_DATASET_PATH, and verified the structure of the dataset directory (see below), but I still get the same error. Thank you for your support.

echo $COIL_DATASET_PATH
/media/ucef/90E6A5B7E6A59DCA/CoiLTRAiNESampleDatasets

CoiLTRAiNESampleDatasets
|
|--CoILTrain
|--CoILVal1
|--CoILVal2

fnozarian commented 5 years ago

I got the same error by running python3 coiltraine.py --folder sample -de TestT1_Town01 -vd CoILVal1 --gpus 0 --docker carlasim/carla:0.8.4:

err_train_log:

Traceback (most recent call last):
  File "/home/farzad/coiltraine/coil_core/train.py", line 119, in execute
    for data in data_loader:
  File "/home/farzad/anaconda2/envs/coiltraine/lib/python3.5/site-packages/torch/utils/data/dataloader.py", line 336, in __next__
    return self._process_next_batch(batch)
  File "/home/farzad/anaconda2/envs/coiltraine/lib/python3.5/site-packages/torch/utils/data/dataloader.py", line 357, in _process_next_batch
    raise batch.exc_type(batch.exc_msg)
numpy.core._internal.AxisError: Traceback (most recent call last):
  File "/home/farzad/anaconda2/envs/coiltraine/lib/python3.5/site-packages/numpy/core/fromnumeric.py", line 51, in _wrapfunc
    return getattr(obj, method)(*args, **kwds)
AttributeError: 'NoneType' object has no attribute 'swapaxes'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/farzad/anaconda2/envs/coiltraine/lib/python3.5/site-packages/torch/utils/data/dataloader.py", line 106, in _worker_loop
    samples = collate_fn([dataset[i] for i in batch_indices])
  File "/home/farzad/anaconda2/envs/coiltraine/lib/python3.5/site-packages/torch/utils/data/dataloader.py", line 106, in <listcomp>
    samples = collate_fn([dataset[i] for i in batch_indices])
  File "/home/farzad/coiltraine/input/coil_dataset.py", line 109, in __getitem__
    img = self.transform(self.batch_read_number * boost, img)
  File "/home/farzad/coiltraine/input/augmenter.py", line 31, in __call__
    img = np.swapaxes(img, 0, 2)
  File "/home/farzad/anaconda2/envs/coiltraine/lib/python3.5/site-packages/numpy/core/fromnumeric.py", line 549, in swapaxes
    return _wrapfunc(a, 'swapaxes', axis1, axis2)
  File "/home/farzad/anaconda2/envs/coiltraine/lib/python3.5/site-packages/numpy/core/fromnumeric.py", line 61, in _wrapfunc
    return _wrapit(obj, method, *args, **kwds)
  File "/home/farzad/anaconda2/envs/coiltraine/lib/python3.5/site-packages/numpy/core/fromnumeric.py", line 41, in _wrapit
    result = getattr(asarray(obj), method)(*args, **kwds)
numpy.core._internal.AxisError: axis1: axis 0 is out of bounds for array of dimension 0

train_log:

preload Name 50hours_CoILTrain
Episode /home/farzad/coiltraine/dataset/CoILTrain/episode_00000 Loaded 0.19058333333333333 hours of data
Episode /home/farzad/coiltraine/dataset/CoILTrain/episode_00001 Loaded 0.34291666666666665 hours of data
Episode /home/farzad/coiltraine/dataset/CoILTrain/episode_00002 Loaded 0.55775 hours of data
Episode /home/farzad/coiltraine/dataset/CoILTrain/episode_00003 Loaded 0.7453333333333333 hours of data
Episode /home/farzad/coiltraine/dataset/CoILTrain/episode_00004 Loaded 0.8699166666666667 hours of data
Episode /home/farzad/coiltraine/dataset/CoILTrain/episode_00005 Loaded 1.0498333333333334 hours of data
Episode /home/farzad/coiltraine/dataset/CoILTrain/episode_00006 Loaded 1.2786666666666666 hours of data
Episode /home/farzad/coiltraine/dataset/CoILTrain/episode_00007 Loaded 1.4104999999999999 hours of data
Episode /home/farzad/coiltraine/dataset/CoILTrain/episode_00008 Loaded 1.538333333333333 hours of data
Episode /home/farzad/coiltraine/dataset/CoILTrain/episode_00010 Loaded 1.725833333333333 hours of data
Episode /home/farzad/coiltraine/dataset/CoILTrain/episode_00011 Loaded 1.7706666666666664 hours of data
Episode /home/farzad/coiltraine/dataset/CoILTrain/episode_00012 Loaded 1.9157499999999996 hours of data
Episode /home/farzad/coiltraine/dataset/CoILTrain/episode_00014 Loaded 2.1914999999999996 hours of data
Episode /home/farzad/coiltraine/dataset/CoILTrain/episode_00015 Loaded 2.221083333333333 hours of data
Episode /home/farzad/coiltraine/dataset/CoILTrain/episode_00016 Loaded 2.366833333333333 hours of data
Episode /home/farzad/coiltraine/dataset/CoILTrain/episode_00017 Loaded 2.4814999999999996 hours of data
Episode /home/farzad/coiltraine/dataset/CoILTrain/episode_00018 Loaded 2.6587499999999995 hours of data
Episode /home/farzad/coiltraine/dataset/CoILTrain/episode_00020 Loaded 2.859083333333333 hours of data
Episode /home/farzad/coiltraine/dataset/CoILTrain/episode_00021 Loaded 3.11325 hours of data
Episode /home/farzad/coiltraine/dataset/CoILTrain/episode_00022 Loaded 3.2121666666666666 hours of data
preload Name 50hours_CoILTrain
Loaded dataset
Before the loss

Zeyuli6 commented 5 years ago

Hello, I get the same error. Have you solved this problem yet?

ghost commented 5 years ago

Same problem. Have you found a way to solve it? I receive this error after I disable eager execution mode in TensorFlow 2.0.

zxbnjust commented 5 years ago

Hello, I get the same error. Can you tell me the pytorch version you are using? I think this might be the reason.

ghost commented 5 years ago

Hello! The PyTorch version I'm using is 1.1.0. Could you please explain how you fixed the problem (if you did)?

Mon, Jun 3, 2019 at 15:43, zxbnjust notifications@github.com:

Hello, I get the same error. Can you tell me the pytorch version you are using? I think this might be the reason.

zxbnjust commented 5 years ago

I don't think that's the reason now, because I changed the version but the error is still there.

RuihanGao commented 5 years ago

Exactly the same error

In my case, I found that it is caused by an inconsistency between the image names expected from sensor_names and the actual filenames in the CoIL dataset: the code expects LeftAugmentationCameraRGB_xxxx while the actual image name is LeftRGB_xxxxx, so the images cannot be loaded by the DataLoader.

However, I have not found a way to fix it, so please do let me know if anyone figures it out. Thanks!
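
A quick check of which camera prefixes actually exist in one episode folder (a rough diagnostic sketch of mine; the episode path is just an example) makes the mismatch visible:

# Hypothetical diagnostic: count the camera-name prefixes present in one episode
# folder so they can be compared with the prefixes coil_dataset.py expects.
import os
from collections import Counter

episode_dir = os.path.join(os.environ["COIL_DATASET_PATH"], "CoILTrain", "episode_00000")
prefixes = Counter(name.rsplit("_", 1)[0] for name in os.listdir(episode_dir) if name.endswith(".png"))
print(prefixes)  # e.g. CentralRGB / LeftRGB / RightRGB rather than the *AugmentationCameraRGB names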

Finally, I found a way. Around line 250 in coil_dataset.py there are three similar if statements; changing them to something like the following helps.

if self.is_measurement_partof_experiment(final_measurement):
    float_dicts.append(final_measurement)
    # rgb = 'CameraRGB_' + data_point_number + '.png'
    rgb = 'CentralRGB_' + data_point_number + '.png'
if self.is_measurement_partof_experiment(final_measurement):
    float_dicts.append(final_measurement)
    # rgb = 'LeftAugmentationCameraRGB_' + data_point_number + '.png'
    rgb = 'LeftRGB_' + data_point_number + '.png'
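
As a side note, a small guard where the image is read would make this failure obvious instead of it surfacing later as the swapaxes error; a sketch (the helper name and the cv2.imread call are my assumptions, not the repository's actual loading code):

# Hypothetical guard (not the repository's code): fail with the offending path
# instead of letting a None image reach np.swapaxes in the augmenter.
import cv2

def read_image_or_fail(img_path):
    img = cv2.imread(img_path)
    if img is None:  # cv2.imread returns None for missing or unreadable files
        raise FileNotFoundError("Could not read image: {}".format(img_path))
    return img
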
felipecode commented 5 years ago

Thanks a lot @havefun28. I was a bit busy and couldn't fix this. I will do that now.