Closed ehartford closed 3 months ago
Yep, working on it. I literally just whipped it together, and I'm finding the Cohere prompt template isn't doing what I want.
cat fff.py | python fff.py "Summarize" --temp 0.0 --model ./mlx-community_c4ai-command-r-plus-4bit
I hit a dead end for the night; kid time begins. Something with piping in the values and there being no default chat template in the tokenizer.
Got it working. Setting add_generation_prompt=True for the case where the chat template is missing from the tokenizer config is what fixed it.
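Roughly, the idea looks like this. This is just a minimal sketch, not the actual commit: the helper name, the fallback token strings, and the dummy tokenizer are all hypothetical, and the real code would call the tokenizer's apply_chat_template (transformers-style API) when a template exists.

```python
# Sketch of the fix: use the tokenizer's chat template if it has one,
# otherwise fall back to a simple hand-rolled format, and make sure the
# generation prompt is appended so the model knows it's its turn to reply.
# (Helper name and fallback tokens are illustrative, not from the repo.)

def build_prompt(tokenizer, messages, add_generation_prompt=True):
    """Build a prompt string from chat messages, with a fallback template."""
    if getattr(tokenizer, "chat_template", None):
        # Tokenizer ships a template: let it do the formatting.
        return tokenizer.apply_chat_template(
            messages, tokenize=False, add_generation_prompt=add_generation_prompt
        )
    # No template in the tokenizer config: fall back to a simple format.
    prompt = "".join(
        f"<|user|>\n{m['content']}\n" for m in messages if m["role"] == "user"
    )
    if add_generation_prompt:
        # Without this trailing marker the model tends to continue the user
        # text instead of answering it.
        prompt += "<|assistant|>\n"
    return prompt


if __name__ == "__main__":
    class DummyTokenizer:
        chat_template = None  # simulate a tokenizer with no chat template

    p = build_prompt(DummyTokenizer(), [{"role": "user", "content": "Summarize"}])
    print(repr(p))
```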
Uploading a new commit.
Can you please show some examples in the README?