rinongal / textual_inversion

MIT License
2.87k stars 278 forks

Config Parameter to Control the number of iterations? #146

Open 2454511550Lin opened 1 year ago

2454511550Lin commented 1 year ago

Dear Author, I have a naive question, which parameter controls the number of iterations when training the embeddings? I learned from your paper that 5k is generally sufficient but I would like to do some experimenting myself.

I thought it was model.params.timesteps. I tried changing it to 5, but the training did not stop after 5 epochs.

Any help would be greatly appreciated. Thank you!

2454511550Lin commented 1 year ago

I think I figured it out. It is more about how to use the PyTorch Lightning trainer. The parameter that controls it is:

trainer:
  max_steps: <# of steps I want>

I'm not sure about the difference between a "step" and an "epoch", though. It seems a "step" means one batch within an epoch? Anyway, thank you for maintaining the repository, and I will keep learning :)
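For anyone else who lands here: in PyTorch Lightning terms, a "step" is one optimizer update on a single batch, and an "epoch" is one full pass over the dataset, so the number of steps per epoch depends on the dataset and batch sizes. A quick sketch with made-up numbers (dataset_size and batch_size here are assumptions, not values from this repo's configs):

```python
import math

# Assumed numbers, purely for illustration.
dataset_size = 5000   # total training samples
batch_size = 4        # samples processed per step (one batch)

# One epoch = one full pass over the dataset, so:
steps_per_epoch = math.ceil(dataset_size / batch_size)

# With trainer max_steps set, training stops after that many batches,
# regardless of how many epochs that works out to.
max_steps = 5000
epochs_needed = max_steps / steps_per_epoch

print(steps_per_epoch)  # 1250
print(epochs_needed)    # 4.0
```

So with these assumed sizes, `max_steps: 5000` would stop training after four epochs; the same `max_steps` with a larger batch size would cover more epochs.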