TRI-ML / packnet-sfm

TRI-ML Monocular Depth Estimation Repository
https://tri-ml.github.io/packnet-sfm/
MIT License
1.23k stars · 242 forks

Custom training dataset format #186

Open UditSinghParihar opened 2 years ago

UditSinghParihar commented 2 years ago

Hello Sir, thanks for providing the code for packnet-sfm.

I am currently trying to train on a custom Carla dataset. My YAML configuration file looks like this:

checkpoint:
    filepath: '/workspace/packnet-sfm/results/checkpoints'
    save_top_k: 5

save:
    folder: '/workspace/packnet-sfm/results'

model:
    name: 'SelfSupModel'
    optimizer:
        name: 'Adam'
        depth:
            lr: 0.0002
        pose:
            lr: 0.0002
    scheduler:
        name: 'StepLR'
        step_size: 30
        gamma: 0.5
    depth_net:
        name: 'PackNet01'
        version: '1A'
    pose_net:
        name: 'PoseNet'
        version: ''
    params:
        crop: 'garg'
        min_depth: 0.0
        max_depth: 80.0

datasets:
    augmentation:
        image_shape: (256, 320)
    train:
        batch_size: 2
        dataset: ['Image']
        path: ['/data/datasets/carla/Town01_short/carla_test/train']
        split: ['{:04}']
        repeat: [1]
    validation:
        dataset: ['Image']
        path: ['/data/datasets/carla/Town01_short/carla_test/val']
        split: ['{:04}']  
    test:
        dataset: ['Image']
        path: ['/data/datasets/carla/Town01_short/carla_test/val']
        split: ['{:04}']
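For context on the `split: ['{:04}']` entry: it appears to be a Python format string used to build zero-padded frame indices (an assumption based on this config, not verified against `image_dataset.py`). Its behaviour can be checked in isolation:

```python
# '{:04}' zero-pads an integer to a width of 4, so frame 7 becomes '0007'.
# If the actual image filenames use a different width or an extension in
# the name, the dataloader would match nothing.
for idx in (0, 7, 123):
    print('{:04}'.format(idx))
```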

My directory structure looks like:

.
├── data_splits
├── train
└── val

However, during training my training loss is non-zero while my validation loss is zero. I suspect the zero validation loss is due to incorrect data loading. Could you help me figure out the correct directory structure for using image_dataset.py, and what the path and split fields in the config file should be? My issue is similar to this issue
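One way to narrow this down is to check whether each split directory actually contains images the loader could pick up. The sketch below is a hypothetical sanity check, not part of packnet-sfm: the paths are copied from the config above, and the `.png` extension is an assumption about how Carla exports frames. A zero count for `val` would explain a validation loss of exactly zero.

```python
from pathlib import Path

def count_images(root, ext='png'):
    """Count candidate frames under a split directory (hypothetical
    helper, mirroring what the Image dataset presumably globs for)."""
    return len(sorted(Path(root).glob(f'*.{ext}')))

# Paths taken from the config above.
for name, root in [('train', '/data/datasets/carla/Town01_short/carla_test/train'),
                   ('val', '/data/datasets/carla/Town01_short/carla_test/val')]:
    print(f'{name}: {count_images(root)} images')
```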

Tloops commented 2 years ago

Hello, I encountered the same problem as you. Have you solved it?