qiank10 / MVDNet

Robust Multimodal Vehicle Detection in Foggy Weather Using Complementary Lidar and Radar Signals, CVPR 2021.
Apache License 2.0

Where to put oxford data #13

Open MaiRajborirug opened 1 year ago

MaiRajborirug commented 1 year ago

I wonder where I can specify the Oxford RobotCar DATAPATH during training and validation:

python ../tools/train.py --config ../configs/train_config.yaml
python ../tools/eval.py --config ../configs/train_config.yaml

Do I need to put the processed data (date: 10/01/2019, time: 11:46:21 GMT) in this directory layout?

|-- MVDNet
    |-- RobotCar
        |-- object
            |-- label_2d
            |-- label_3d
            |-- radar
            |-- radar_history
            |-- lidar
            |-- lidar_history
            |-- lidar_fog_0.05               # Foggy lidar data with fog density of 0.05
            |-- lidar_history_fog_0.05
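
A quick, hedged sanity check of the layout (a minimal sketch; it assumes the loader resolves paths relative to the directory the training script is launched from, and the base path below is simply the one that appears in the error further down):

import os

# Assumed base path, matching the one in the traceback below; adjust to your setup.
base = "./data/RobotCar/object"
for sub in ["label_2d", "label_3d", "radar", "radar_history",
            "lidar", "lidar_history", "lidar_fog_0.05", "lidar_history_fog_0.05"]:
    path = os.path.join(base, sub)
    print(path, "->", "OK" if os.path.isdir(path) else "MISSING")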

I tried this layout, but here is the error message:

[03/17 12:33:42 d2.data.build]: Removed 367 images with no usable annotations. 6697 images left.
[03/17 12:33:42 d2.data.build]: Distribution of instances among all 1 categories:
|  category  | #instances   |
|:----------:|:-------------|
|    car     | 42928        |
|            |              |
[03/17 12:33:42 d2.data.common]: Serializing 6697 elements to byte tensors and concatenating them all ...
[03/17 12:33:42 d2.data.common]: Serialized dataset takes 4.72 MiB
[03/17 12:33:42 d2.data.build]: Using training sampler TrainingSampler
[03/17 12:33:42 fvcore.common.checkpoint]: No checkpoint found. Initializing model from scratch
[03/17 12:33:42 d2.engine.train_loop]: Starting training from iteration 0

ERROR [03/17 12:33:42 d2.engine.train_loop]: Exception during training:
Traceback (most recent call last):
  File "/home/ubuntu/Documents/github/detectron2/detectron2/engine/train_loop.py", line 132, in train
    self.run_step()
  File "/home/ubuntu/Documents/github/detectron2/detectron2/engine/train_loop.py", line 209, in run_step
    data = next(self._data_loader_iter)
  File "/home/ubuntu/Documents/github/detectron2/detectron2/data/common.py", line 140, in __iter__
    for d in self.dataset:
  File "/home/ubuntu/miniconda3/envs/detectron2/lib/python3.7/site-packages/torch/utils/data/dataloader.py", line 521, in __next__
    data = self._next_data()
  File "/home/ubuntu/miniconda3/envs/detectron2/lib/python3.7/site-packages/torch/utils/data/dataloader.py", line 1203, in _next_data
    return self._process_data(data)
  File "/home/ubuntu/miniconda3/envs/detectron2/lib/python3.7/site-packages/torch/utils/data/dataloader.py", line 1229, in _process_data
    data.reraise()
  File "/home/ubuntu/miniconda3/envs/detectron2/lib/python3.7/site-packages/torch/_utils.py", line 425, in reraise
    raise self.exc_type(msg)
FileNotFoundError: Caught FileNotFoundError in DataLoader worker process 0.
Original Traceback (most recent call last):
  File "/home/ubuntu/miniconda3/envs/detectron2/lib/python3.7/site-packages/torch/utils/data/_utils/worker.py", line 287, in _worker_loop
    data = fetcher.fetch(index)
  File "/home/ubuntu/miniconda3/envs/detectron2/lib/python3.7/site-packages/torch/utils/data/_utils/fetch.py", line 44, in fetch
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/home/ubuntu/miniconda3/envs/detectron2/lib/python3.7/site-packages/torch/utils/data/_utils/fetch.py", line 44, in <listcomp>
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/home/ubuntu/Documents/github/detectron2/detectron2/data/common.py", line 41, in __getitem__
    data = self._map_func(self._dataset[cur_idx])
  File "/home/ubuntu/Documents/github/detectron2/detectron2/utils/serialize.py", line 23, in __call__
    return self._obj(*args, **kwargs)
  File "/home/ubuntu/Documents/github/MVDNet/mvdnet/data/robotcar_mapper.py", line 94, in __call__
    lidar_history_data = np.fromfile(lidar_history_name, dtype=np.float32)
FileNotFoundError: [Errno 2] No such file or directory: './data/RobotCar/object/lidar_history/1547122069632436_1.bin'

However, ./data/RobotCar/object/lidar_history/1547122069632436_1.bin does exist in that folder.
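
One thing worth checking when ls shows the file but NumPy still raises FileNotFoundError is whether the entry is a symlink whose target no longer resolves. A minimal sketch (the path is just the one from the traceback):

import os

# Path taken from the traceback above.
p = "./data/RobotCar/object/lidar_history/1547122069632436_1.bin"

print("entry exists (even if it is a broken symlink):", os.path.lexists(p))
print("entry is a symlink:", os.path.islink(p))
if os.path.islink(p):
    print("symlink target:", os.readlink(p))
print("target resolves:", os.path.exists(p))  # False for a dangling symlink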


My setup is Ubuntu 20.04, Python 3.7.16, detectron2 0.1.1, and torch 1.9.1+cu111.

I appreciate any help and suggestions. Best regards.

ykw3333 commented 1 year ago

Hi, how's it going?

AkashSrinivasulu commented 1 year ago

Hi @MaiRajborirug, were you able to train the MVDNet? If so, can you please help me?

ykw3333 commented 1 year ago

> Hi @MaiRajborirug, were you able to train the MVDNet? If so, can you please help me?

Sorry, I can't train it either. QAQ

MaiRajborirug commented 1 year ago

Hi @AkashSrinivasulu @ykw3333, I found that the files xxxx_1.bin - xxxx_4.bin are not actual .bin files; they are symlinks to the .bin files in data/RobotCar/object/lidar/.... The author did this because a symlink takes only ~60 bytes, while an actual .bin file costs ~1.3 MB. Also, my NumPy setup cannot read these symlinks; it can only read the actual .bin files. So I did two things:

  1. Correct the symlinks' targets. When the symlink files are moved, their targets stay the same, so the correct target depends on where you keep the lidar/ and lidar_history/ folders. Since I moved the lidar files from folder/processed/lidar/xxxx.bin to ./data/RobotCar/object/lidar/xxxx.bin, I needed to adjust the symlink targets accordingly. Here is my code for it:
# Step 1: rewrite the symlink targets so they point at the new lidar/ location
import os, errno

root_path = "../ORR_dataset/processed"
hist_path = os.path.join(root_path, 'lidar_history')  # folder that contains the lidar_history symlinks
edit_path = './data/RobotCar/object/lidar'            # new target folder, relative to where MVDNet is run

def symlink_force(target, link_name):
    """Create a symlink at link_name, replacing any existing entry."""
    try:
        os.symlink(target, link_name)
    except OSError as e:
        if e.errno == errno.EEXIST:
            os.remove(link_name)
            os.symlink(target, link_name)
        else:
            raise e

for file in os.listdir(hist_path):
    if file[-5] != 'T':  # skip the regular xxxx_T.bin files; only rewrite the symlinks
        src_path = os.readlink(os.path.join(hist_path, file))   # old (stale) target
        src_file = src_path.split('/')[-1]                      # keep only the file name
        src_edit_path = os.path.join(edit_path, src_file)       # new target under edit_path
        symlink_force(src_edit_path, os.path.join(hist_path, file))
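
To double-check the rewrite, here is a small sketch (assuming the same hist_path as above, run from the working directory you launch training from, since the new targets are relative paths that the robotcar_mapper.py patch in step 2 below resolves against that directory):

import os

# Same folder as hist_path in the script above.
hist_path = "../ORR_dataset/processed/lidar_history"

broken = []
for f in os.listdir(hist_path):
    link = os.path.join(hist_path, f)
    # os.readlink returns the raw target string, which is resolved here
    # relative to the current working directory.
    if os.path.islink(link) and not os.path.exists(os.readlink(link)):
        broken.append(f)
print(len(broken), "symlink targets do not resolve from the current working directory")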

To check whether your symlinks have the correct targets, you can run ls -al ../data/RobotCar/object/lidar_history in a terminal. My results changed from

lrwxrwxrwx 1 ubuntu ubuntu      49 Mar 24 12:45 1547120788638924_1.bin -> ./ORR_datasets/processed/lidar/1547120788388907.bin
-rw-rw-r-- 1 ubuntu ubuntu      64 Mar 17 11:04 1547120788638924_1_T.bin

to

lrwxrwxrwx 1 ubuntu ubuntu      49 Mar 24 12:45 1547120788638924_1.bin -> ./data/RobotCar/object/lidar/1547120788388907.bin
-rw-rw-r-- 1 ubuntu ubuntu      64 Mar 17 11:04 1547120788638924_1_T.bin

  2. My NumPy setup can't read the symlinks directly, so I needed to adjust the MVDNet file MVDNet/mvdnet/data/robotcar_mapper.py from
# robotcar_mapper.py file
...
lidar_history_data = np.fromfile(lidar_history_name, dtype=np.float32) # line 94
...
lidar_history_data = np.fromfile(lidar_history_name, dtype=np.float32) # line 107

to

# robotcar_mapper.py file
...
lidar_history_data = np.fromfile(os.readlink(lidar_history_name), dtype=np.float32) # line 94
...
lidar_history_data = np.fromfile(os.readlink(lidar_history_name), dtype=np.float32) # line 107
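
A caveat with this change: os.readlink raises OSError if the path is not actually a symlink, and it needs os to be imported in robotcar_mapper.py if it is not already. A guarded variant of the same idea (a sketch using the same variable names as in the file, not part of the original patch):

# Only dereference when the path is actually a symlink; otherwise read the file directly.
lidar_history_path = (os.readlink(lidar_history_name)
                      if os.path.islink(lidar_history_name)
                      else lidar_history_name)
lidar_history_data = np.fromfile(lidar_history_path, dtype=np.float32)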

Hope this helps.