noahzn / Lite-Mono

[CVPR2023] Lite-Mono: A Lightweight CNN and Transformer Architecture for Self-Supervised Monocular Depth Estimation
MIT License

Total Number of Parameters #36

Closed: ArminMasoumian closed this issue 1 year ago

ArminMasoumian commented 1 year ago

Thank you for sharing your great work. I want to print the total number of parameters, but it seems to be giving me the wrong numbers.

I added these two lines of code to trainer.py, right after print("Training is using:\n ", self.device):

print("Total number of parameters to train:", len(self.parameters_to_train)) print("Total number of parameters to train Pose:", len(self.parameters_to_train_pose))

However, here are the results I got: "Total number of parameters to train: 227, Total number of parameters to train Pose: 70".

Would you please let me know how I can print the total number of parameters for the whole training? I'm not using any pre-trained model.

Here is the command for training:

python train.py --data_path /media/armin/DATA/Lightweight/kitti_data --model_name mytrain --num_epochs 30 --num_workers 4 --batch_size 4 --lr 0.0001 5e-6 31 0.0001 1e-5 31
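
As a side note, len() on those lists counts parameter tensors (groups), not individual weights, which is why the numbers come out as 227 and 70. A minimal sketch that sums the element count of each tensor instead, assuming parameters_to_train and parameters_to_train_pose are flat lists of torch tensors as in monodepth2-style trainers:

```python
# Minimal sketch: numel() counts the weights inside each tensor; len() only counts tensors.
total = sum(p.numel() for p in self.parameters_to_train)
total_pose = sum(p.numel() for p in self.parameters_to_train_pose)
print("Total number of parameters to train: {:,}".format(total))
print("Total number of parameters to train (pose): {:,}".format(total_pose))
```

With this, the printed counts correspond to the number of trainable weights rather than the number of tensors.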

noahzn commented 1 year ago

Hello,

There is a function that you can use to compute the parameters and FLOPs of a model. You need to install thop first.
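
For reference, a rough sketch of how thop is typically used (this is not the repository's exact helper; the model below is a placeholder stand-in, and thop reports MACs, which are often quoted loosely as FLOPs):

```python
import torch
from thop import profile, clever_format

# Placeholder model: substitute the encoder/decoder actually built in this repo.
model = torch.nn.Sequential(
    torch.nn.Conv2d(3, 32, 3, padding=1),
    torch.nn.ReLU(),
    torch.nn.Conv2d(32, 32, 3, padding=1),
)

dummy_input = torch.randn(1, 3, 192, 640)              # KITTI-style 640x192 input
macs, params = profile(model, inputs=(dummy_input,))   # thop returns (MACs, params)
macs, params = clever_format([macs, params], "%.3f")
print("MACs:", macs, "params:", params)
```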

ArminMasoumian commented 1 year ago

> Hello,
>
> There is a function that you can use to compute the parameters and FLOPs of a model. You need to install thop first.

Thank you for your prompt response. I have an additional question regarding training the lite-mono-8m model from scratch. I'm unsure whether I need to include "--model lite-mono-8m" in my command line. Could you please clarify which of the following commands should be used for training the lite-mono-8m model from scratch with an image size of 1024x320?

Command 1: python train.py --data_path /media/armin/DATA/Lightweight/kitti_data --model_name mytrain --model lite-mono-8m --num_epochs 30 --num_workers 4 --batch_size 4 --height 320 --width 1024

Command 2: python train.py --data_path /media/armin/DATA/Lightweight/kitti_data --model_name mytrain --num_epochs 30 --num_workers 4 --batch_size 4 --height 320 --width 1024

In addition, I would like to train my model using an image resolution of 1024x320, using the pre-trained ImageNet model you provided in this repository. However, the pre-trained ImageNet model available is specifically trained for an image resolution of 640x192. I'm wondering if it is possible to use the same pre-trained model for different image sizes, or if I need to create my own pre-trained model specifically for an image resolution of 1024x320.

noahzn commented 1 year ago
  1. Command 1 is correct. If you do not specify --model, the default model is lite-mono.
  2. You don't need different ImageNet pre-trained weights for the resolution of 1024x320. The pre-trained weights were obtained by training on ImageNet using an input size of 256x256.
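
To illustrate point 2: convolutional weight shapes do not depend on the input resolution, so weights pretrained at 256x256 can initialize training at 640x192 or 1024x320 without modification. A toy sketch with a stand-in fully-convolutional block (not the actual Lite-Mono encoder):

```python
import torch
import torch.nn as nn

# Toy fully-convolutional stand-in (not the Lite-Mono encoder): its weight shapes
# are independent of the input resolution.
block = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1),
    nn.ReLU(),
    nn.Conv2d(16, 16, 3, padding=1),
)

state = block.state_dict()      # pretend these were pretrained at 256x256
block.load_state_dict(state)    # loads the same way whatever resolution comes next

for h, w in [(256, 256), (192, 640), (320, 1024)]:
    out = block(torch.randn(1, 3, h, w))
    print(f"input {w}x{h} -> output {tuple(out.shape)}")
```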
ArminMasoumian commented 1 year ago
> 1. Command 1 is correct. If you do not specify --model, the default model is lite-mono.
> 2. You don't need different ImageNet pre-trained weights for the resolution of 1024x320. The pre-trained weights were obtained by training on ImageNet using an input size of 256x256.

Thank you so much!