griftt opened this issue 7 months ago
I'm having the same issue: if I use nginx as a simple reverse proxy in front of ollama it works fine, but when I try litellm (to use tools) I get this 500 error. I wonder if it's a timeout in litellm.
Are you getting this in the litellm log?
raise APIConnectionError( litellm.exceptions.APIConnectionError: 'eval_count'
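For context, here is an illustrative sketch (not litellm's actual code) of how a KeyError('eval_count') like this can arise: the proxy reads a token-usage field from Ollama's JSON response, and that field is sometimes absent, e.g. on function-call responses. The field name mirrors Ollama's /api/generate output; the handler logic is an assumption:

```python
# Hypothetical Ollama-style response that is missing the usage field "eval_count".
response_json = {"model": "mistral", "response": "...", "done": True}

# Strict access raises KeyError('eval_count') -- the exact string seen in the 500 body.
try:
    completion_tokens = response_json["eval_count"]
except KeyError as exc:
    error_message = str(exc)  # "'eval_count'"

# Defensive access avoids the crash when the field is missing.
completion_tokens = response_json.get("eval_count", 0)
```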
Please use a more descriptive title for the issue.
You also need to provide the model you are using, the pyautogen version, the litellm/ollama setup, and the code snippet that leads to the error.
pyautogen version: 0.2.21
Tested on: ollama/mistral:latest, ollama/mistral:instruct, ollama/gemma:latest
The issue triggers only when I include functions in the llm_config:
Working code:
llm_config = {
    "config_list": [
        {
            "model": "NotRequired",  # Loaded with the LiteLLM command
            "api_key": "NotRequired",  # Not needed
            "base_url": "http://192.168.x.x:4000",  # Your LiteLLM URL
        }
    ],
    "cache_seed": None,  # Turns off caching, useful for testing different models
}
Not working code:
llm_config_func = {
    "config_list": [
        {
            "model": "NotRequired",  # Loaded with the LiteLLM command
            "api_key": "NotRequired",  # Not needed
            "base_url": "http://192.168.x.x:4000",  # Your LiteLLM URL
        }
    ],
    "cache_seed": None,  # Turns off caching, useful for testing different models
    "functions": [search_declaration],
}
Agents:
assistant = AssistantAgent(
    'assistant',
    system_message="You are a helpful assistant, end all your answers with TERMINATE",
    llm_config=llm_config,
)
user_proxy = UserProxyAgent(
    'user_proxy',
    is_termination_msg=is_termination_msg,
    human_input_mode='NEVER',
    function_map={"Search": search},
)
groupchat = GroupChat(agents=[assistant, user_proxy], speaker_selection_method="round_robin", messages=[])
groupchat_manager = GroupChatManager(groupchat, llm_config=llm_config)
Search declaration and code (I tested it alone and it works):
from datetime import datetime, timedelta
import json
import requests
import os
search_declaration = {
    "name": "Search",
    "description": "Search at Google and returns Title, Link, and Snippet",
    "parameters": {
        "type": "object",
        "properties": {
            "query": {
                "type": "string",
                "description": "The search query"
            }
        },
        "required": ["query"]
    },
}
def search(query):
    # Calculate the date 6 months ago from today
    six_months_ago = datetime.now() - timedelta(days=6 * 30)  # approximating 6 months
    date_str = six_months_ago.strftime('%Y-%m-%d')
    # Append the date filter to the query
    query = f"{query} after:{date_str}"
    url = "https://google.serper.dev/search"
    # Get the API key from an environment variable
    api_key = os.environ.get('GOOGLE_SERPER_API_KEY')
    if not api_key:
        raise ValueError(
            "Environment variable 'GOOGLE_SERPER_API_KEY' not set!")
    payload = json.dumps({"q": query})
    headers = {
        'X-API-KEY': api_key,
        'Content-Type': 'application/json'
    }
    response = requests.post(url, headers=headers, data=payload)
    return response.json()
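To double-check the wiring locally without a live Serper call, you can simulate how the agent dispatches a tool call: the model emits a function name plus a JSON arguments string, which gets decoded and passed as keyword arguments to the entry in function_map. This is a sketch; fake_search and the hard-coded call payload are illustrative stand-ins, not part of the original code:

```python
import json

# Stub standing in for search(); avoids a live Serper API call.
def fake_search(query):
    return {"query_received": query}

function_map = {"Search": fake_search}

# A tool call as the model would emit it: a name plus a JSON arguments string.
call = {"name": "Search", "arguments": '{"query": "autogen litellm"}'}
result = function_map[call["name"]](**json.loads(call["arguments"]))
```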
Error:
openai.InternalServerError: Error code: 500 - {'error': {'message': "'eval_count'", 'type': None, 'param': None, 'code': 500}}
This may be an issue with the model not supporting function calling, or with the legacy functions API style you are using. Can you include your litellm logs?
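One thing worth trying (a sketch; whether litellm and your model honor the newer schema is not confirmed here): move the same declaration from the legacy "functions" key to a "tools" entry, which wraps the declaration in a typed envelope as the current OpenAI API expects. The config keys below are an assumption about what the backend accepts:

```python
search_declaration = {
    "name": "Search",
    "description": "Search at Google and returns Title, Link, and Snippet",
    "parameters": {
        "type": "object",
        "properties": {"query": {"type": "string", "description": "The search query"}},
        "required": ["query"],
    },
}

# Legacy style (what the failing config uses):
llm_config_legacy = {"functions": [search_declaration]}

# Newer tools style: same schema, wrapped in a {"type": "function", ...} object.
llm_config_tools = {"tools": [{"type": "function", "function": search_declaration}]}
```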
Describe the issue
Question: I hit this error when running the tool-call demo:
Traceback (most recent call last):
  File "/mnt/workspace/chat-autogen/util/autogenUtil/SimpleChat.py", line 99, in <module>
    caht.initSimeDoubleChat("1+11等于多少")
  File "/mnt/workspace/chat-autogen/util/autogenUtil/SimpleChat.py", line 93, in initSimeDoubleChat
    host.initiate_chat(chinese, message="帮我写程序计算从1 加到100 总和是多少", max_turns=4)
  File "/opt/conda/lib/python3.10/site-packages/autogen/agentchat/conversable_agent.py", line 980, in initiate_chat
    self.send(msg2send, recipient, request_reply=True, silent=silent)
  File "/opt/conda/lib/python3.10/site-packages/autogen/agentchat/conversable_agent.py", line 629, in send
    recipient.receive(message, self, request_reply, silent)
  File "/opt/conda/lib/python3.10/site-packages/autogen/agentchat/conversable_agent.py", line 788, in receive
    reply = self.generate_reply(messages=self.chat_messages[sender], sender=sender)
  File "/opt/conda/lib/python3.10/site-packages/autogen/agentchat/conversable_agent.py", line 1876, in generate_reply
    final, reply = reply_func(self, messages=messages, sender=sender, config=reply_func_tuple["config"])
  File "/opt/conda/lib/python3.10/site-packages/autogen/agentchat/conversable_agent.py", line 1275, in generate_oai_reply
    extracted_response = self._generate_oai_reply_from_client(
  File "/opt/conda/lib/python3.10/site-packages/autogen/agentchat/conversable_agent.py", line 1294, in _generate_oai_reply_from_client
    response = llm_client.create(
  File "/opt/conda/lib/python3.10/site-packages/autogen/oai/client.py", line 623, in create
    response = client.create(params)
  File "/opt/conda/lib/python3.10/site-packages/autogen/oai/client.py", line 276, in create
    response = completions.create(**params)
  File "/opt/conda/lib/python3.10/site-packages/openai/_utils/_utils.py", line 275, in wrapper
    return func(*args, **kwargs)
  File "/opt/conda/lib/python3.10/site-packages/openai/resources/chat/completions.py", line 667, in create
    return self._post(
  File "/opt/conda/lib/python3.10/site-packages/openai/_base_client.py", line 1208, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "/opt/conda/lib/python3.10/site-packages/openai/_base_client.py", line 897, in request
    return self._request(
  File "/opt/conda/lib/python3.10/site-packages/openai/_base_client.py", line 973, in _request
    return self._retry_request(
  File "/opt/conda/lib/python3.10/site-packages/openai/_base_client.py", line 1021, in _retry_request
    return self._request(
  File "/opt/conda/lib/python3.10/site-packages/openai/_base_client.py", line 973, in _request
    return self._retry_request(
  File "/opt/conda/lib/python3.10/site-packages/openai/_base_client.py", line 1021, in _retry_request
    return self._request(
  File "/opt/conda/lib/python3.10/site-packages/openai/_base_client.py", line 988, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.InternalServerError: Error code: 500 - {'error': {'message': "'eval_count'", 'type': None, 'param': None, 'code': 500}}
Steps to reproduce
No response
Screenshots and logs
No response
Additional Information
No response