Closed grzegorzj closed 1 month ago
@grzegorzj Saving the tokenizer or the processor has been raising an error for the last few weeks. So I just removed the code that saves it, because saving doesn't change its configs anyway. Does the latest code still struggle with the error?
After commenting out the lines that save the config & processor, everything works smoothly. During inference I use the original processor & config from allenai, which seems to work well - thank you!
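For anyone who'd rather keep the save call in place than delete it, here's a minimal sketch of the same workaround as a guard. The `safe_save` helper is hypothetical (not from the repo); `save_fn` stands in for whatever the script calls, e.g. `tokenizer.save_pretrained`:

```python
def safe_save(save_fn, *args, **kwargs):
    """Attempt a config/processor save, but don't let a serialization
    failure abort the whole checkpoint (hypothetical helper)."""
    try:
        save_fn(*args, **kwargs)
        return True
    except TypeError as exc:
        # json.dump raises TypeError when the config holds
        # non-serializable objects, which matches the symptom here.
        print(f"Skipping tokenizer/processor save: {exc}")
        return False
```

The model weights still get checkpointed; only the tokenizer/processor save is skipped, and at inference you can load those from the original upstream checkpoint as described above.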
Hi! First of all, huge thanks for creating this - incredible value. Thank you so much.
I've run into an issue where, after a successful test training with `finetune.sh` and `zero3.json`, saving a checkpoint fails: it writes a `tokenizer_config.json` that is 0 bytes and stops there. I've seen someone hit a similar error, but frankly speaking I have no clue why this is happening.
I understand that the tokenizer can't be dumped to JSON because it contains non-serializable data - any clues on what could have changed? I didn't change any code from the original repo.
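To narrow down which field breaks the dump, one option is a quick check over the tokenizer's config dict (assuming it's reachable as a plain dict, e.g. via `tokenizer.init_kwargs` - that attribute name may vary by transformers version):

```python
import json

def find_unserializable(config):
    """Return the keys whose values json.dumps cannot encode."""
    bad = []
    for key, value in config.items():
        try:
            json.dumps(value)
        except TypeError:
            bad.append(key)
    return bad

# Toy example: a raw object() sneaking into the config breaks serialization.
# find_unserializable({"pad_token": "<pad>", "proc": object()}) -> ["proc"]
```

Whatever keys it reports are the ones that would make `save_pretrained` fail (or leave a truncated, 0-byte JSON file if the dump dies mid-write).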