princeton-nlp / MeZO

[NeurIPS 2023] MeZO: Fine-Tuning Language Models with Just Forward Passes. https://arxiv.org/abs/2305.17333
MIT License

question about MeZO-adam #37

Open zhaoaustin opened 3 weeks ago

zhaoaustin commented 3 weeks ago

Hi! I found the MeZO-Adam code in the medium-models folder, but it uses the Adam from torch.optim. It's not like the large_models case, where the authors rewrite the inner loop. Could you please explain this? Thank you very much for your time and effort.

gaotianyu1350 commented 3 weeks ago

Hi,

We found in the medium-model experiments that Adam is only comparable to or worse than SGD. Hence we only ran it as an ablation (on the medium-sized models) and did not implement the memory-efficient version.
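
To illustrate why an efficient version is harder for Adam than for SGD: MeZO estimates the gradient with two forward passes, `g_hat = (L(theta + eps*z) - L(theta - eps*z)) / (2*eps)`, and the perturbation `z` can be regenerated from a random seed instead of being stored, so MeZO-SGD needs no extra per-parameter memory. Adam, however, must keep full-size first- and second-moment buffers. The sketch below is illustrative only (NumPy instead of PyTorch, all names hypothetical), not the repo's implementation:

```python
import numpy as np

def mezo_adam_step(theta, loss_fn, state, step, seed, eps=1e-3, lr=1e-2,
                   beta1=0.9, beta2=0.999, adam_eps=1e-8):
    """One MeZO step combined with an Adam-style update (illustrative sketch).

    The SPSA-style estimate uses two forward passes; z is regenerated from
    `seed`, so it costs no memory. The Adam moments m and v, by contrast,
    must persist at full parameter size across steps.
    """
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(theta.shape)

    # Two forward passes with symmetric perturbations.
    loss_plus = loss_fn(theta + eps * z)
    loss_minus = loss_fn(theta - eps * z)
    g_scalar = (loss_plus - loss_minus) / (2 * eps)
    grad_est = g_scalar * z  # full-size gradient estimate

    # Standard Adam moment updates with bias correction.
    m, v = state
    m = beta1 * m + (1 - beta1) * grad_est
    v = beta2 * v + (1 - beta2) * grad_est ** 2
    m_hat = m / (1 - beta1 ** step)
    v_hat = v / (1 - beta2 ** step)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + adam_eps)
    return theta, (m, v)

# Toy usage: minimize ||theta||^2 with forward passes only.
theta = np.ones(4)
state = (np.zeros(4), np.zeros(4))
loss_fn = lambda t: float(np.sum(t ** 2))
for step in range(1, 501):
    theta, state = mezo_adam_step(theta, loss_fn, state, step, seed=step)
```

With SGD the update `theta -= lr * g_scalar * z` can be replayed in place from the seed, but Adam's `m` and `v` cannot be reconstructed that way, which is why the efficient variant was skipped.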