Clad3815 opened 1 month ago
@krrishdholakia This error also happens with the new version:
```
litellm-1 | 17:21:44 - LiteLLM:ERROR: ollama_chat.py:519 - LiteLLM.ollama(): Exception occured - sequence item 19: expected str instance, NoneType found
litellm-1 | Traceback (most recent call last):
litellm-1 |   File "/usr/local/lib/python3.11/site-packages/litellm/llms/ollama_chat.py", line 493, in ollama_async_streaming
litellm-1 |     response_content = first_chunk_content + "".join(content_chunks)
litellm-1 |                                              ^^^^^^^^^^^^^^^^^^^^^^^
litellm-1 | TypeError: sequence item 19: expected str instance, NoneType found
```
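The traceback points at the unconditional `"".join(content_chunks)`: when the model streams tool-call deltas, those chunks carry `content: None` rather than a string, so the join raises. A minimal Python sketch of the failure mode (the chunk values and the filtering workaround are illustrative assumptions on my part, not the actual LiteLLM patch):

```python
# Sketch of the streaming path's failure: tool-call deltas have no text
# content, so the collected chunk list can contain None entries.
content_chunks = ["Sure", ", ", None, "done"]  # None = tool-call delta

try:
    response_content = "".join(content_chunks)
except TypeError as e:
    # Same class of error as the log above:
    # "sequence item 2: expected str instance, NoneType found"
    print(e)

# A defensive join that skips None entries would avoid the crash:
response_content = "".join(c for c in content_chunks if c is not None)
print(response_content)  # Sure, done
```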
Reproducible code:
```js
const OpenAI = require("openai");

const openai = new OpenAI({
  apiKey: "sk-1234",
  baseURL: "http://localhost:4444/v1",
});

const tools = [
  {
    type: "function",
    function: {
      name: "get_delivery_date",
      description:
        "Get the delivery date for a customer's order. Call this whenever you need to know the delivery date, for example when a customer asks 'Where is my package'",
      parameters: {
        type: "object",
        properties: {
          order_id: {
            type: "string",
            description: "The customer's order ID.",
          },
        },
        required: ["order_id"],
        additionalProperties: false,
      },
    },
  },
];

const messages = [
  { role: "system", content: "You are a helpful customer support assistant. Use the supplied tools to assist the user." },
  { role: "user", content: "Hi, can you tell me the delivery date for my order? My order ID is 1234567890." },
];

async function main() {
  const stream = await openai.beta.chat.completions.stream({
    model: "ollama_chat/llama3.2",
    messages: messages,
    tools: tools,
    stream: true,
    api_base: "http://host.docker.internal:11434", // Override LiteLLM config (Optional)
  });

  for await (const chunk of stream) {
    process.stdout.write(chunk.choices[0]?.delta?.content || "");
  }

  const chatCompletion = await stream.finalChatCompletion();
  console.log(chatCompletion);
}

main();
```
I got the same error.
What happened?
I'm using the Docker version, v1.48.19-stable. After a tool call I got the error shown in the log output above.