TRI-ML / packnet-sfm

TRI-ML Monocular Depth Estimation Repository
https://tri-ml.github.io/packnet-sfm/
MIT License

tar: /data/datasets: Cannot open: No such file or directory #88

Closed TheRustlessSummer closed 3 years ago

TheRustlessSummer commented 3 years ago

Hello, thanks for your amazing code, but I ran into a problem when running the command `curl -s https://tri-ml-public.s3.amazonaws.com/github/DDAD/datasets/DDAD.tar | tar -xv -C /data/datasets/` [screenshot: `tar: /data/datasets: Cannot open: No such file or directory`]. I had already run `mkdir /data/datasets/`; the folder is shown below. [screenshot of folder listing] As the picture shows, I also downloaded KITTI, but when I ran `python3 scripts/train.py configs/overfit_kitti.yaml` the same problem occurred, as below. [screenshot of traceback]

TheRustlessSummer commented 3 years ago

```
root@sun-MS-7A38:/workspace/packnet-sfm# python3 scripts/train.py configs/overfit_kitti.yaml
Preparing Model
Model: SelfSupModel
DepthNet: DepthResNet
PoseNet: PoseResNet
Preparing Datasets
Setup train datasets
Traceback (most recent call last):
  File "scripts/train.py", line 64, in <module>
    train(args.file)
  File "scripts/train.py", line 53, in train
    model_wrapper = ModelWrapper(config, resume=ckpt, logger=logger)
  File "/workspace/packnet-sfm/packnet_sfm/models/model_wrapper.py", line 64, in __init__
    self.prepare_datasets(validation_requirements, test_requirements)
  File "/workspace/packnet-sfm/packnet_sfm/models/model_wrapper.py", line 91, in prepare_datasets
    self.model.train_requirements, augmentation)
  File "/workspace/packnet-sfm/packnet_sfm/models/model_wrapper.py", line 521, in setup_dataset
    dataset_args, **dataset_args_i,
  File "/workspace/packnet-sfm/packnet_sfm/datasets/kitti_dataset.py", line 106, in __init__
    with open(file_list, "r") as f:
FileNotFoundError: [Errno 2] No such file or directory: '/data/datasets/KITTI_tiny/kitti_tiny.txt'
```

VitorGuizilini-TRI commented 3 years ago

I think you are creating data/datasets inside the packnet-sfm root folder (packnet-sfm/data/datasets); it's supposed to be at the root of your filesystem (/data/datasets), so the datasets don't get copied in with the docker build. It should still work if you change the paths accordingly (i.e. from /data/datasets to data/datasets).
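To make the two layouts concrete (based on my reading of your setup; your exact subfolders may differ):

```
/data/datasets/KITTI_tiny              <- where the default configs look (filesystem root)
/workspace/packnet-sfm/data/datasets   <- where the folder was likely created (repo root)
```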

TheRustlessSummer commented 3 years ago

> I think you are creating data/datasets inside the packnet-sfm root folder (packnet-sfm/data/datasets); it's supposed to be at the root of your filesystem (/data/datasets), so the datasets don't get copied in with the docker build. It should still work if you change the paths accordingly (i.e. from /data/datasets to data/datasets).

Thanks for your help. I have another question: how can I view the resulting depth maps and point clouds?

VitorGuizilini-TRI commented 3 years ago

When running the eval.py script you can set some save parameters in the .yaml config file. These will save the depth maps; you can then project them to 3D using the camera intrinsics and plot them with your favorite visualizer. I am working on porting my own custom visualizer to this repository, but I'm not sure when it will land, probably closer to the end of the year.
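The projection step is standard pinhole back-projection; here is a minimal NumPy sketch (the function name and array shapes are illustrative, not part of the packnet-sfm API):

```python
import numpy as np

def depth_to_pointcloud(depth, K):
    """Back-project a depth map of shape (H, W) into an (N, 3) camera-frame
    point cloud using pinhole intrinsics K of shape (3, 3).
    Illustrative helper only, not part of the packnet-sfm API."""
    h, w = depth.shape
    # Homogeneous pixel coordinates, one (u, v, 1) column per pixel -> (3, N)
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    pixels = np.stack([u, v, np.ones_like(u)]).reshape(3, -1)
    # Rays through each pixel (K^-1 @ pixels), scaled by the predicted depth
    rays = np.linalg.inv(K) @ pixels
    return (rays * depth.reshape(1, -1)).T
```

The resulting (N, 3) array can then be written to a .ply file or loaded into any point-cloud viewer.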

TheRustlessSummer commented 3 years ago

Thanks for your suggestion. I have another question: the DDAD and KITTI datasets you used are too large for me. Can you suggest other datasets smaller than 50 GB for training?

VitorGuizilini-TRI commented 3 years ago

As a starting point you can use our `_tiny` datasets (e.g. KITTI_tiny); they are good enough for overfitting and for quickly trying out new ideas.

chaotianjiao commented 3 years ago

@TheRustlessSummer Hi, can you tell me how to change /data/datasets to data/datasets? Which .py file should I modify? Thanks a lot!

VitorGuizilini-TRI commented 3 years ago

You can change the .yaml files directly.
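For example, the dataset paths in configs/overfit_kitti.yaml look roughly like this (excerpt from memory, so double-check the actual file; only the path entries need to change):

```yaml
datasets:
    train:
        dataset: ['KITTI']
        path: ['data/datasets/KITTI_tiny']   # was '/data/datasets/KITTI_tiny'
        split: ['kitti_tiny.txt']
```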