MathieuNlp / Sam_LoRA

Segment Your Ring (SYR) - Segment Anything model adapted with LoRA to segment rings.
MIT License

why sam not save when training ends. #9


Ultraman6 commented 1 month ago

Hi, I ran through your program. Why are only the LoRA weights saved at the end of fine-tuning? The prompt encoder and mask decoder downstream of SAM are trainable ("hot") parameters, so don't they need to be saved as well? In your inference_eval.py only the LoRA parameters are loaded; the downstream SAM parameters are not considered at all!

MathieuNlp commented 3 weeks ago

Hi,

For the training checkpoints, see the "Limitation" part of the README; it explains why I chose to save only at the end of training. As for loading weights: I freeze all of SAM's weights, so loading a pretrained SAM model is enough for those. Therefore I only need to load the LoRA checkpoint on top.
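The logic behind this answer can be sketched as follows. This is a minimal illustration with plain dicts standing in for PyTorch state_dicts; the function names and parameter keys (`lora_A`, `image_encoder.w`, etc.) are hypothetical, not the repo's actual API. The point is that when the SAM backbone is frozen, its weights are fully determined by the original SAM checkpoint, so saving only the LoRA parameters loses nothing:

```python
def split_trainable(state, lora_prefix="lora_"):
    """Keep only the LoRA parameters; frozen SAM weights are dropped."""
    return {k: v for k, v in state.items() if k.startswith(lora_prefix)}

def restore(base_state, lora_ckpt):
    """Rebuild the full model: pretrained SAM weights + saved LoRA weights."""
    merged = dict(base_state)   # frozen weights come from the SAM checkpoint
    merged.update(lora_ckpt)    # only the LoRA weights were trained and saved
    return merged

# Pretrained SAM weights (frozen: identical before and after fine-tuning).
sam = {"image_encoder.w": 1.0, "prompt_encoder.w": 2.0, "mask_decoder.w": 3.0}

# After fine-tuning, only the lora_* entries differ from initialization.
trained = {**sam, "lora_A": 0.5, "lora_B": -0.5}

ckpt = split_trainable(trained)   # what gets written to disk
rebuilt = restore(sam, ckpt)      # what loading SAM + the LoRA checkpoint does

assert rebuilt == trained         # no information was lost
```

If the prompt encoder or mask decoder were actually unfrozen during training, this scheme would no longer be sufficient and their weights would have to be saved too, which is exactly the concern raised in the question.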