Closed: chaos369 closed this issue 5 months ago
Thank you.
I've opened a PR on litellm that should fix this: https://github.com/BerriAI/litellm/pull/3469
Just need to bump the litellm dependency version in magentic once that's merged and released.
@chaos369 @DevAseel I have just published https://github.com/jackmpcollins/magentic/releases/tag/v0.23.0 which enables structured outputs and function calling with ollama 🦙 Depending on what model you use, you might need to prompt it to "use the tool" or add more detail about the response format in the prompt.
Example:

```python
from magentic import prompt
from magentic.chat_model.litellm_chat_model import LitellmChatModel


@prompt(
    "Count to {n}. Use the tool to return in the format [1, 2, 3, ...]",
    model=LitellmChatModel("ollama_chat/llama2", api_base="http://localhost:11434"),
)
def count_to(n: int) -> list[int]: ...


count_to(5)
# > [1, 2, 3, 4, 5]
```
Please let me know if you hit any issues with this. Thanks
It hardly ever manages to return an object. @jackmpcollins
```python
from pydantic import BaseModel

from magentic import prompt
from magentic.chat_model.litellm_chat_model import LitellmChatModel

wizardlm2 = LitellmChatModel("ollama/wizardlm2:7b")
llama3 = LitellmChatModel("ollama/llama3:instruct")
gpt35 = LitellmChatModel(model="openai/gpt-3.5-turbo")

llm_model = wizardlm2


class Superhero(BaseModel):
    name: str
    age: int
    power: str
    enemies: list[str]


@prompt(
    "Create a Superhero named {name}, use the tool to return a Superhero struct.",
    model=llm_model,
)
def create_superhero(name: str) -> Superhero: ...


hero = create_superhero("Garden Man")
```
```
Traceback (most recent call last):
  File "/home/mac/workspace/projects/mag_hero.py", line 23, in <module>
    ...
```
@chaos369 I think the `ResponseNotRead` issue here could be caused by Ollama. Could you make sure it is fully up to date and that you have the `llama3` model pulled/downloaded?
The first issue is caused by the model returning an invalid function name, `createGardenMan`, instead of the `return_superhero` that magentic expects. I've tried several prompts with llama3 to fix this but it is very unreliable. I think the solution will require updates to ollama and litellm. I've opened a new magentic issue to track this, https://github.com/jackmpcollins/magentic/issues/207, so please follow that.
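For illustration, this is roughly how the mismatch looks in the OpenAI-style tool-call format that litellm normalizes responses to. Only the two function names come from the run above; the argument values are made up:

```python
# The tool magentic registers for the Superhero return type and expects the model to call:
expected = {
    "name": "return_superhero",
    "arguments": '{"name": "Garden Man", "age": 42, "power": "...", "enemies": []}',  # illustrative values
}

# What llama3 actually emitted: an invented function name, which magentic
# cannot match to any registered tool, so parsing the response fails.
actual = {
    "name": "createGardenMan",
    "arguments": "...",
}
```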
@chaos369 Looks like ollama does support tool calls, but litellm currently has some bugs with how it is parsing these. I've opened an issue there: https://github.com/BerriAI/litellm/issues/3333
When that is fixed, you should be able to use ollama models via the `LitellmChatModel` in magentic by using the `"ollama_chat/"` prefix (instead of `"ollama/"`) in the model name. Something like this (based on the example from https://github.com/jackmpcollins/magentic/issues/50#issuecomment-1794099320):
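A minimal sketch of that usage, assuming a local Ollama server on the default port with the llama2 model pulled (the prompt and return type here are just illustrative):

```python
from magentic import prompt
from magentic.chat_model.litellm_chat_model import LitellmChatModel


@prompt(
    "Name three colors. Use the tool to return them as a list.",
    # "ollama_chat/" prefix (rather than "ollama/") so litellm routes the
    # request through Ollama's chat API, where tool calls are supported.
    model=LitellmChatModel("ollama_chat/llama2", api_base="http://localhost:11434"),
)
def three_colors() -> list[str]: ...


three_colors()
# e.g. ["red", "green", "blue"]
```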
@knoopx you might be interested in following this issue