Open 2454511550Lin opened 1 year ago
I think I figured it out. It is more about how to use the PyTorch Lightning `Trainer`. The parameter that controls it seems to be:

```yaml
trainer:
  max_steps: <# of steps I want>
```
I'm not sure about the difference between a "step" and an "epoch", though. It seems a "step" means one batch within an epoch? Anyway, thank you for maintaining the repository, and I will keep learning :)
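For what it's worth, my current understanding (not confirmed by the repository): in PyTorch Lightning, one "step" is one optimizer update on one batch, so an epoch contains `ceil(dataset_size / batch_size)` steps. A quick sketch with made-up numbers:

```python
import math

def steps_per_epoch(dataset_size: int, batch_size: int) -> int:
    """One 'step' = one batch = one optimizer update, so an epoch over
    dataset_size samples contains ceil(dataset_size / batch_size) steps."""
    return math.ceil(dataset_size / batch_size)

# Hypothetical numbers, just for illustration.
n_samples, batch = 10_000, 32
per_epoch = steps_per_epoch(n_samples, batch)  # 313 steps per epoch
max_steps = 5_000                              # the ~5k steps from the paper
epochs_needed = max_steps / per_epoch          # roughly 16 epochs
print(per_epoch, round(epochs_needed, 1))
```

So with these assumed numbers, `max_steps: 5000` would stop training partway through the 16th epoch.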
Dear Author, I have a naive question: which parameter controls the number of iterations when training the embeddings? I learned from your paper that 5k is generally sufficient, but I would like to do some experimenting myself.
I thought it was `model.params.timesteps`. I tried changing it to 5, but the training did not stop after 5 epochs. Any help would be greatly appreciated. Thank you!