LiheYoung / Depth-Anything

[CVPR 2024] Depth Anything: Unleashing the Power of Large-Scale Unlabeled Data. Foundation Model for Monocular Depth Estimation
https://depth-anything.github.io
Apache License 2.0

Fine-Tune #5

Open JamesWandy opened 7 months ago

JamesWandy commented 7 months ago

Will the code for fine-tuning the models be released? Thank you for your excellent work.

LiheYoung commented 7 months ago

Do you mean fine-tuning for in-domain and zero-shot metric depth estimation? We have released the training code here.

VimalMollyn commented 7 months ago

Thanks for the great work! Do you plan to release the pretraining code as well?

JamesWandy commented 7 months ago

@LiheYoung No, I'm not interested in metric depth estimation. I mean fine-tuning on new data to achieve better relative depth estimation.

sysuyl commented 7 months ago

Hello, I'm also interested in semantic segmentation. Will the code for fine-tuning segmentation be released? @LiheYoung

adam99goat commented 7 months ago

Hi, I am wondering whether it is possible to fine-tune on a custom dataset for relative depth estimation as well.

jorgejgnz commented 4 months ago

I am also interested in fine-tuning this model with a custom dataset

cht619 commented 3 months ago

How can the model be used for semantic segmentation?
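For what it's worth, one common way to probe this: Depth Anything's encoder is a DINOv2-style ViT, so its patch tokens can feed a lightweight segmentation head. Below is a minimal PyTorch sketch of such a head; the `patch_size`, embedding dimension, and token layout are assumptions for illustration, not the repo's actual API:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SegHead(nn.Module):
    """Minimal linear segmentation head over ViT patch features.

    Assumes an encoder that maps (B, 3, H, W) images to patch tokens of
    shape (B, (H // p) * (W // p), C), as a DINOv2-style backbone does.
    All names and the patch size here are illustrative.
    """

    def __init__(self, embed_dim: int, num_classes: int, patch_size: int = 14):
        super().__init__()
        self.patch_size = patch_size
        # 1x1 conv = per-patch linear classifier over the feature channels.
        self.classifier = nn.Conv2d(embed_dim, num_classes, kernel_size=1)

    def forward(self, tokens: torch.Tensor, img_hw: tuple) -> torch.Tensor:
        H, W = img_hw
        h, w = H // self.patch_size, W // self.patch_size
        # (B, N, C) -> (B, C, h, w) feature map.
        x = tokens.transpose(1, 2).reshape(tokens.size(0), -1, h, w)
        x = self.classifier(x)  # (B, num_classes, h, w)
        # Upsample patch-level logits back to pixel resolution.
        return F.interpolate(x, size=(H, W), mode="bilinear",
                             align_corners=False)
```

In practice you would extract the encoder's patch tokens (frozen or fine-tuned) and train only this head with a per-pixel cross-entropy loss.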

rafaelagrc commented 3 months ago

Yes, I am also interested in fine-tuning for relative depth estimation.

shilpaullas97 commented 2 months ago

@adam99goat , @jorgejgnz , @rafaelagrc

Did you get a chance to fine-tune the model for relative depth estimation? Also, is there any way to fine-tune it on a custom unlabeled dataset?
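For anyone attempting relative-depth fine-tuning while the official code is unavailable: relative (affine-invariant) depth training typically uses a scale-and-shift-invariant loss in the MiDaS style, where the prediction is aligned to the target with a closed-form least-squares scale and shift before the error is measured. A minimal sketch, assuming PyTorch; this is an illustration of the general technique, not the authors' released training code:

```python
import torch

def ssi_loss(pred: torch.Tensor, target: torch.Tensor,
             mask: torch.Tensor, eps: float = 1e-6) -> torch.Tensor:
    """Scale-and-shift-invariant L1 loss for relative depth.

    For each image in the batch, `pred` is aligned to `target` with the
    least-squares scale s and shift t over valid pixels, so predictions
    are penalized only up to an affine transform (MiDaS-style).
    """
    pred = pred.flatten(1)        # (B, N)
    target = target.flatten(1)
    mask = mask.flatten(1).float()

    n = mask.sum(dim=1).clamp(min=1.0)
    p_mean = (pred * mask).sum(1) / n
    t_mean = (target * mask).sum(1) / n
    # Centered, masked residuals.
    p_c = (pred - p_mean[:, None]) * mask
    t_c = (target - t_mean[:, None]) * mask
    # Closed-form least-squares alignment: target ~ s * pred + t.
    s = (p_c * t_c).sum(1) / ((p_c * p_c).sum(1) + eps)
    t = t_mean - s * p_mean
    aligned = s[:, None] * pred + t[:, None]
    return ((aligned - target).abs() * mask).sum() / mask.sum().clamp(min=1.0)
```

With this loss, fine-tuning on a custom labeled dataset is an ordinary training loop over the pretrained model's depth output; for unlabeled data, the paper's recipe is pseudo-labeling with a teacher model plus strong augmentations, which this loss alone does not cover.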