Closed dlwh closed 2 weeks ago
Previously we were padding every sequence to the tokenizer's max length, which is really bad with Llama 3 (its tokenizer reports an enormous `model_max_length`, so fixed-length padding wastes a huge amount of compute). Now we pad to the longest sequence in each batch instead.
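A minimal sketch of the dynamic-padding idea (illustrative only, not the PR's actual code): pad each batch to its own longest sequence rather than to a fixed tokenizer maximum, and build the matching attention mask.

```python
def pad_batch(seqs, pad_id=0):
    """Pad token-id sequences to the batch max, not a fixed tokenizer max.

    seqs:   list of lists of token ids (variable lengths)
    pad_id: hypothetical pad token id (assumption; depends on tokenizer)
    Returns (padded_ids, attention_mask).
    """
    max_len = max(len(s) for s in seqs)  # batch-local max, not model_max_length
    padded = [s + [pad_id] * (max_len - len(s)) for s in seqs]
    mask = [[1] * len(s) + [0] * (max_len - len(s)) for s in seqs]
    return padded, mask


ids, mask = pad_batch([[1, 2, 3], [4]])
# Sequences are padded only to length 3 (the batch max), so a tokenizer
# advertising a 131072-token model_max_length no longer blows up the batch.
```

With Hugging Face tokenizers the equivalent is `padding="longest"` (or `padding=True`) instead of `padding="max_length"`.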