Closed: duarteocarmo closed this issue 1 month ago
What happens when you don't pass the tokenizer? The endpoint should be returning text, so a tokenizer shouldn't be necessary.
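For comparison, here is a minimal sketch against the same Ollama endpoint using the plain openai client (assuming openai>=1.0 is installed): the response already comes back as decoded text, which is why a client-side tokenizer shouldn't be required just to read the completion.
# Sketch: query Ollama's OpenAI-compatible endpoint directly.
# The server returns plain text, so no tokenizer is needed on the client side.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")
response = client.chat.completions.create(
    model="phi3.5",
    messages=[{"role": "user", "content": "Question: What's 2+2? Answer:"}],
    max_tokens=100,
)
print(response.choices[0].message.content)  # already detokenized text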
@lapp0:
# must have ollama running locally with the phi3.5 model pulled
import tiktoken  # unused in this snippet; the KeyError comes from outlines' internal tiktoken lookup
from outlines import generate, models

model = models.openai(
    "phi3.5",
    base_url="http://localhost:11434/v1",
    api_key="ollama",
)

generator = generate.text(model)
result = generator("Question: What's 2+2? Answer:", max_tokens=100)
print(result)
Error:
KeyError: 'Could not automatically map phi3.5 to a tokeniser. Please use `tiktoken.get_encoding` to explicitly get the tokeniser you expect.'
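For reference, the explicit lookup the error message points at is tiktoken.get_encoding; in the sketch below, "cl100k_base" is purely a stand-in encoding (phi3.5 has no tiktoken mapping, which is exactly why the automatic lookup fails).
import tiktoken

# Explicit lookup suggested by the error message; "cl100k_base" is an
# arbitrary stand-in, not phi3.5's actual vocabulary.
enc = tiktoken.get_encoding("cl100k_base")
print(enc.encode("Question: What's 2+2? Answer:"))

# The automatic mapping attempted internally is roughly this, and it raises
# the KeyError above for non-OpenAI model names:
# tiktoken.encoding_for_model("phi3.5")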
@duarteocarmo thanks for pointing this out to me.
I've removed the unnecessary tokenizer requirement. Could you please try
pip install --upgrade --force-reinstall git+https://github.com/lapp0/outlines@openai-structured-generation
@lapp0 - that solves it! Thanks for the help :)
Great news, glad to help!
Describe the issue as clearly as possible:
OpenAI errors out when trying to set tokenizer.
Steps/code to reproduce the bug:
Expected result:
Error message:
Outlines/Python version information:
Context for the issue:
No response