sugary199 opened this issue 4 months ago
Or did I make a mistake somewhere? I would be very grateful if you could give me some help!
Hi~ This is not a mistake on your side. It is because our released vq_ds16_t2i.pt doesn't include optimizer parameters.
Thank you for your response! I was wondering if there are any plans to release an updated version with the necessary optimizer parameters. Alternatively, is it possible to train without these parameters? If you could provide some guidance or assistance, it would be extremely helpful for my work.
I get the same problem when fine-tuning GPT-XL:
optimizer.load_state_dict(checkpoint["optimizer"])
KeyError: 'optimizer'
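As a workaround while waiting for updated checkpoints, you can guard the optimizer restore and fall back to a freshly initialized optimizer when the released checkpoint ships weights only. A minimal sketch follows; the placeholder model/optimizer and the helper name are my own illustration, not the repo's actual training code:

```python
import torch

def load_optimizer_if_present(optimizer, checkpoint):
    """Restore optimizer state only if the checkpoint actually contains it."""
    if "optimizer" in checkpoint:
        optimizer.load_state_dict(checkpoint["optimizer"])
        return True
    return False

# Placeholder model/optimizer; replace with the ones built by the training script.
model = torch.nn.Linear(4, 4)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

checkpoint = torch.load("vq_ds16_t2i.pt", map_location="cpu")
if not load_optimizer_if_present(optimizer, checkpoint):
    print("No 'optimizer' key in checkpoint; continuing with a fresh optimizer state.")
```

Starting with a fresh optimizer usually just means the first steps re-warm the optimizer's moment estimates; the model weights themselves load normally.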
I wonder if t2i_XL_stage1_256.pt has the same problem? @PeizeSun thanks for your attention!
@PeizeSun Hi, I met the same problem when fine-tuning the t2i model:
Does vq_ds16_t2i.pt also need to be updated?