Some HF models do not have a pad token ID set, and transformers warns on every generation call that it is setting the tokenizer's EOS token as the pad token. This change sets the pad token preemptively so the warning is never triggered.
Also: Add docstring for load_config_and_tokenizer()
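The guard described above can be sketched as follows. This is a minimal illustration, not the actual patch: `ensure_pad_token` is a hypothetical helper name, and the `SimpleNamespace` below stands in for a real `transformers` tokenizer so the snippet runs without downloading a model. With a real tokenizer, assigning `tokenizer.pad_token` is enough; the explicit `pad_token_id` assignment here only mirrors that behavior for the stand-in object.

```python
from types import SimpleNamespace

def ensure_pad_token(tokenizer):
    # If the tokenizer ships without a pad token, reuse the EOS token
    # up front so transformers does not warn on every generate() call.
    if tokenizer.pad_token_id is None:
        tokenizer.pad_token = tokenizer.eos_token
        tokenizer.pad_token_id = tokenizer.eos_token_id
    return tokenizer

# Hypothetical stand-in for an HF tokenizer that has no pad token set.
tok = SimpleNamespace(pad_token=None, pad_token_id=None,
                      eos_token="</s>", eos_token_id=2)
ensure_pad_token(tok)
print(tok.pad_token, tok.pad_token_id)  # → </s> 2
```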