Closed: adder closed this issue 1 week ago
I noticed that conda-forge was still on version 0.0.46, so I updated with pip to the latest version 0.1.13, but I still get the same error.
Can you try manually setting the tokenizer like they do here?
import llama_cpp
from outlines import generate, models

model = models.llamacpp(
    "bartowski/Llama-3.2-1B-Instruct-GGUF",
    "Llama-3.2-1B-Instruct-Q4_K_M.gguf",
    tokenizer=llama_cpp.llama_tokenizer.LlamaHFTokenizer.from_pretrained(
        "meta-llama/Llama-3.2-1B"
    ),
)
Thanks! That is the solution. I actually used this instead:
model = models.llamacpp(
    "bartowski/Llama-3.2-1B-Instruct-GGUF",
    "Llama-3.2-1B-Instruct-Q4_K_M.gguf",
    # tokenizer needs to be set manually
    tokenizer=llama_cpp.llama_tokenizer.LlamaHFTokenizer.from_pretrained(
        "unsloth/Llama-3.2-1B-Instruct"
    ),
)
I used the unsloth repo since you need credentials to download anything from the meta-llama repo.
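For context, here is a minimal sketch of how a model built this way is then used with generate.json, which is the call that originally triggered the error. The Character schema is a hypothetical example, not from the thread, and the generator lines are commented out because they require downloading the GGUF weights:

```python
# Hypothetical Pydantic schema for structured generation (illustrative only).
from pydantic import BaseModel

class Character(BaseModel):
    name: str
    age: int

# With `model` constructed as above, outlines constrains sampling to the
# schema (lines commented out since they need the downloaded GGUF model):
# from outlines import generate
# generator = generate.json(model, Character)
# character = generator("Describe a fantasy character.")  # -> Character instance

# generate.json derives its constraints from the schema's JSON form:
print(sorted(Character.model_json_schema()["required"]))
```

Running the last line prints the schema's required fields, ['age', 'name'], which is what the constrained sampler guarantees in the output JSON.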
Perfect. Closing for now, glad it worked!
Describe the issue as clearly as possible:
When trying to use generate.json with Llama 3.2 and llamacpp, I get a runtime error: RuntimeError: Cannot convert token ` �` (30433) to bytes: �"
Steps/code to reproduce the bug:
Expected result:
Error message:
Outlines/Python version information:
0.0.46
Context for the issue:
No response