taketwo / llm-ollama

LLM plugin providing access to local Ollama models via the HTTP API
Apache License 2.0

Error: 'Options' object has no attribute 'model_dump' #6

Closed shiyongxin closed 5 months ago

shiyongxin commented 5 months ago

% llm -m llama3 "How much is 2+2?"
Error: 'Options' object has no attribute 'model_dump'

taketwo commented 5 months ago

Hey @Taik, looks like there is an issue with the code we recently merged. Do you have any clue what may be the reason?

taketwo commented 5 months ago

By the way, I cannot reproduce this with the latest versions of packages:

Taik commented 5 months ago

Ah - sorry about this; I didn't test with a clean env. I believe this is caused by pydantic. Will send a PR to fix this shortly.

You might have multiple llm plugins installed, and it just so happens that one of them pulls in pydantic v2 as a dependency, which is why you're not seeing the issue (this was also my case).
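The error fits that diagnosis: pydantic v2 renamed `BaseModel.dict()` to `model_dump()`, so calling `model_dump()` on a model under pydantic v1 raises exactly this `AttributeError`. The actual PR isn't shown in this thread, but a minimal sketch of a version-agnostic serializer (the helper name `model_to_dict` is hypothetical) might look like:

```python
def model_to_dict(options):
    """Serialize a pydantic model's options across pydantic v1 and v2.

    pydantic v2 renamed BaseModel.dict() to model_dump(); invoking
    model_dump() on a v1 model raises the AttributeError reported
    in this issue, so we probe for the method before calling it.
    """
    if hasattr(options, "model_dump"):
        return options.model_dump()  # pydantic v2
    return options.dict()  # pydantic v1 fallback
```

Probing with `hasattr` keeps the plugin working regardless of which pydantic major version another installed plugin happens to pin.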

taketwo commented 5 months ago

@shiyongxin I've tagged a new release. Please run `llm install --upgrade llm-ollama` to get the fix.