YufeiWang777 / LRRU

Official implementation of "LRRU: Long-short Range Recurrent Updating Networks for Depth Completion", ICCV 2023.

The size of the pre-trained parameters does not correspond to what is mentioned in the paper. #19

Open disco14 opened 5 months ago

disco14 commented 5 months ago

Dear Author,

Thank you very much for your outstanding work. I have encountered an issue, however. In your paper, you list the sizes of the pre-trained models as 0.3M, 1.3M, 5M, and 21M, but after downloading the parameters you provided, I found their sizes to be 1.5M, 5.4M, 21.1M, and 83.6M, respectively, which do not match the paper. When loaded with PyTorch, the parameters also appear to be much larger than reported. Could you please clarify the reason for this discrepancy?

Best regards.
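
For reference, the two quantities can be measured directly. Below is a minimal sketch, assuming the downloaded file is a plain PyTorch `state_dict` (possibly wrapped under a `"model"` key); the checkpoint filename is a placeholder, not the actual release name:

```python
import os
import torch

# Hypothetical path; substitute one of the downloaded LRRU checkpoints.
ckpt_path = "lrru_mini.pt"

# Size of the checkpoint file on disk, in megabytes.
disk_mb = os.path.getsize(ckpt_path) / 1e6
print(f"file size on disk: {disk_mb:.1f} MB")

# Number of learnable parameters stored in the checkpoint.
state = torch.load(ckpt_path, map_location="cpu")
state_dict = state.get("model", state) if isinstance(state, dict) else state
n_params = sum(t.numel() for t in state_dict.values() if torch.is_tensor(t))
print(f"parameter count: {n_params / 1e6:.2f} M")
```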

YufeiWang777 commented 5 months ago

Hi, the numbers described in the paper (0.3M, 1.3M, 5M, and 21M) are the model parameter counts. I don't know how these numbers (1.5M, 5.4M, 21.1M, and 83.6M) were obtained. Are these the storage sizes on disk?
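
If the second set of numbers is indeed file size in megabytes, the roughly 4x factor between the two lists is consistent with float32 storage, since each float32 parameter occupies 4 bytes. A quick check of that assumption:

```python
# Each float32 parameter takes 4 bytes, so a parameter count of N million
# should translate to roughly 4*N MB of weight data on disk.
paper_params_m = [0.3, 1.3, 5.0, 21.0]   # parameter counts from the paper (millions)
observed_mb = [1.5, 5.4, 21.1, 83.6]     # downloaded file sizes (MB, per the issue)

for p, mb in zip(paper_params_m, observed_mb):
    est = p * 4  # estimated float32 weight storage in MB
    print(f"{p:5.1f}M params -> ~{est:5.1f} MB expected, {mb:5.1f} MB observed")
```

The estimates (1.2, 5.2, 20.0, 84.0 MB) track the observed sizes closely; the small remainders are plausibly non-learnable buffers, checkpoint metadata, or serialization overhead.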