Closed — shiyongxin closed this issue 5 months ago
Hey @Taik, looks like there is an issue with the code we recently merged. Do you have any clue what may be the reason?
By the way, I cannot reproduce this with the latest versions of packages:
llm==0.14
llm-ollama==0.3.0
ollama==0.1.38
Ah, sorry about this; I didn't test with a clean env. I believe this is because of pydantic: `model_dump()` exists only in pydantic v2, while pydantic v1 only provides `dict()`. Will send a PR to fix this shortly.
You might have multiple llm plugins installed, and it may just so happen that one of them pulls in pydantic v2 as a dependency — that would explain why you're not seeing the issue (this was also my case).
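For anyone curious why the environment matters: the method name changed between pydantic major versions, so code written against v2 breaks on v1. A minimal sketch of a version-agnostic dump (the `Options` model and `temperature` field here are illustrative, not the plugin's actual model):

```python
from pydantic import VERSION, BaseModel


class Options(BaseModel):
    # Hypothetical field, just to have something to serialize.
    temperature: float = 0.8


opts = Options()

# pydantic v2 renamed dict() to model_dump(); calling model_dump()
# on v1 raises AttributeError, which matches the error in this issue.
if VERSION.startswith("1."):
    data = opts.dict()
else:
    data = opts.model_dump()

print(data)
```

This prints `{'temperature': 0.8}` under either major version.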
@shiyongxin I've tagged a new release. Please run `llm install --upgrade llm-ollama` to get the fix.
```
% llm -m llama3 "How much is 2+2?"
Error: 'Options' object has no attribute 'model_dump'
```