TRI-ML / packnet-sfm

TRI-ML Monocular Depth Estimation Repository
https://tri-ml.github.io/packnet-sfm/
MIT License

Error while training on custom data using Image dataset input loading #243

Open KoushikSamudrala opened 1 year ago

KoushikSamudrala commented 1 year ago

1) I am trying to train on my custom dataset with `SelfSupModel`, using the Image dataset as the input loading reference. During training, the model does not read the training images from the train split file (it reports 0 images there); instead it reads all the images in the testing set and then fails with an error. It would be very kind of you if you could help me here.
2) Also, how can we measure validation loss with the code in this repo, given that `validation_epoch_end` returns `**metrics_dict` and not `loss_and_metrics` like the training loop? I am a beginner at this kind of programming, so any help with estimating the validation loss would be appreciated.
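Regarding the second question, one possible approach (a sketch, not the repo's own implementation) is to run the full self-supervised forward pass inside the validation step, so the same photometric loss used in training can also be averaged and logged on the validation split. The names `validation_step` / `validation_epoch_end` come from the question above; the assumption that calling the model on a full batch returns a dict containing a `'loss'` tensor mirrors the training path and may need adapting to the actual wrapper in this repo. Note that the validation dataloader would also have to provide context frames, otherwise the photometric loss cannot be computed.

```python
import torch

class ValidationLossMixin:
    """Sketch only: assumes a Lightning-style wrapper exposing self.model
    (a SelfSupModel) whose forward pass on a full batch returns a dict with
    a 'loss' tensor, as it does during training."""

    def validation_step(self, batch, batch_idx, dataset_idx=0):
        with torch.no_grad():
            output = self.model(batch)  # full self-supervised forward pass
        # Keep the loss alongside whatever depth metrics the wrapper already computes.
        val_loss = output['loss'].detach() if 'loss' in output else None
        return {'val_loss': val_loss}

    def validation_epoch_end(self, outputs):
        # Average per-batch validation losses into a single scalar for logging.
        losses = [o['val_loss'] for o in outputs if o.get('val_loss') is not None]
        metrics_dict = {}
        if losses:
            metrics_dict['val_loss'] = torch.stack(losses).mean().item()
        return metrics_dict
```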

My config file looks like this:

```yaml
arch:
    max_epochs: 10
checkpoint:
    filepath: '/content/drive/MyDrive/packnetsfm/checkpoints/'
    monitor: 'abs_rel_pp_gt'
    monitor_index: 0
    save_top_k: -1
    mode: 'min'
model:
    name: 'SelfSupModel'
    checkpoint_path: '/content/drive/MyDrive/packnetsfm/checkpoints/PackNet01_MR_velsup_CStoK.ckpt'
    optimizer:
        name: 'Adam'
        depth:
            lr: 0.0001
        pose:
            lr: 0.0001
    scheduler:
        name: 'StepLR'
        step_size: 30
        gamma: 0.5
    depth_net:
        name: 'PackNet01'
        version: '1A'
    pose_net:
        name: 'PoseNet'
        version: ''
    params:
        crop: 'garg'
        min_depth: 0.0
        max_depth: 200.0
datasets:
    augmentation:
        image_shape: (384, 640)
    train:
        batch_size: 3
        dataset: ['Image']
        path: ['/content/drive/MyDrive/data']
        split: ['{:02}']
    validation:
        dataset: ['Image']
        path: ['/content/drive/MyDrive/data']
        split: ['{:03}']
```
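As a quick sanity check on the first problem (0 training images found), it can help to verify outside the repo that the `path`/`split` entries above actually resolve to image files. The snippet below only assumes that training images live in subfolders of the data path named after the split pattern (e.g. `00`, `01`, ...); the exact matching logic used by the Image dataset in this repo may differ, so adjust the pattern accordingly.

```python
import os
from glob import glob

# Assumption: images live under <data_path>/<split-formatted folder>,
# e.g. /content/drive/MyDrive/data/00, /content/drive/MyDrive/data/01, ...
data_path = '/content/drive/MyDrive/data'
split_format = '{:02}'  # taken from the train split entry in the config above

for idx in range(10):
    folder = os.path.join(data_path, split_format.format(idx))
    images = sorted(glob(os.path.join(folder, '*.png')) +
                    glob(os.path.join(folder, '*.jpg')))
    print(f'{folder}: {len(images)} images')
```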