File "/home/hsai/works/code-llama-fim-fine-tuning/dataset.py", line 45, in __init__
tokenizer.eot_token, add_special_tokens=False
AttributeError: 'LlamaTokenizerFast' object has no attribute 'eot_token'. Did you mean: 'eos_token'?
Hi @hwaking! Thank you for reporting this! I think it's caused by a breaking change in one of the dependencies. I'll be able to take a look at it over the weekend.
Please let me know if you find a solution on your own!
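In the meantime, a possible workaround is to fall back to `eos_token` when the tokenizer no longer exposes `eot_token`. This is a minimal sketch, not a confirmed fix — the `get_eot_token` helper and the `FakeTokenizer` stand-in are hypothetical, used here only to show the fallback pattern without requiring `transformers` to be installed:

```python
def get_eot_token(tokenizer):
    # Hypothetical workaround: prefer eot_token if the tokenizer still
    # has it; otherwise fall back to eos_token, as the error message
    # ("Did you mean: 'eos_token'?") suggests.
    return getattr(tokenizer, "eot_token", None) or tokenizer.eos_token


# Minimal stand-in mimicking a tokenizer without an eot_token attribute
# (e.g. newer LlamaTokenizerFast builds), just to demonstrate the fallback.
class FakeTokenizer:
    eos_token = "</s>"


print(get_eot_token(FakeTokenizer()))  # → </s>
```

Whether `eos_token` is actually the correct replacement for FIM training depends on how the model's special tokens are set up, so please treat this as a stopgap until the dependency change is confirmed.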