Open guo0O0o opened 3 months ago
Following the sequence tagging ChemProt example gives errors on newer versions of HF transformers (currently on 4.41.2). Apparently `model.save_pretrained('folder/')` now saves the model as `.safetensors` rather than `.bin`.
hmmm... I went to replicate this now and I can't get it to fail.
From @guo0O0o: I think you can also just add the flag `--save_safetensors False` at the end of your script invocation.
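As a sketch of the workaround above (the script name and other arguments here are placeholders, not from the original example), the flag would be appended to the training command like so, since `save_safetensors` is a `TrainingArguments` option in recent transformers versions:

```shell
# Hypothetical training invocation; only --save_safetensors False is the
# suggested fix, the rest stands in for the Chemprot example's arguments.
python run_ner.py \
    --model_name_or_path allenai/scibert_scivocab_uncased \
    --output_dir folder/ \
    --save_safetensors False   # write pytorch_model.bin instead of model.safetensors
```

Alternatively, when calling `save_pretrained` directly, passing `safe_serialization=False` should have the same effect of producing a `.bin` checkpoint.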