Use .chat_completion instead of .completion when parsing the response from Model.ask, since Model.ask calls the chat endpoint.
Some LLM providers return identical structures for chat_completion and completion (e.g. OpenAI), while others return different response structures from the API (e.g. Ollama), so parsing with .completion breaks for the latter.
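As a rough illustration of why the accessor matters, here is a minimal, self-contained sketch. The OllamaStyleResponse wrapper below is an illustrative stand-in, not an actual class in this codebase; the payload shapes mirror Ollama's chat vs. plain completion endpoints.

```ruby
# Illustrative stand-in (not this codebase's real response class) showing why
# .chat_completion is the right accessor for a chat-endpoint response.
OllamaStyleResponse = Struct.new(:raw) do
  def chat_completion
    raw.dig("message", "content") # chat endpoint nests the text under message.content
  end

  def completion
    raw["response"] # only present on the plain completion endpoint's payload
  end
end

response = OllamaStyleResponse.new("message" => { "content" => "4" })

puts response.chat_completion # => "4"
p response.completion         # => nil, which is why parsing Model.ask with .completion breaks
```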