Rainismer opened this issue 1 week ago
Hey @Rainismer, can you try running `pip install -U openai phidata`?
@manthanguptaa They are already the latest versions, but the problem still exists.
Can you try removing `base_url` and trying again?
@manthanguptaa There is no problem when using the native OpenAI API directly, but at my company we have to go through a self-hosted API, so please have your team fix this proxy-related error as soon as possible.
@manthanguptaa You could refer to how litellm handles this.
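Roughly, the idea there is to rebuild each tool call by merging the streamed deltas without trusting the chunk's `index` to always be present. A minimal sketch of that pattern (the `index`, `id`, and `function` fields are assumptions based on the OpenAI streaming format, not litellm's or phidata's actual code):

```python
from typing import Any, Dict, List


def merge_tool_call_chunks(chunks: List[Any]) -> List[Dict[str, Any]]:
    """Merge streamed tool-call deltas into complete tool calls.

    Assumes each chunk looks like an OpenAI ChoiceDeltaToolCall: an optional
    `index`, an optional `id`, and a `function` carrying `name` / `arguments`
    fragments. Some proxies send `index=None`, so fall back to the chunk's
    position in the stream instead of comparing an int against None.
    """
    tool_calls: List[Dict[str, Any]] = []
    for position, chunk in enumerate(chunks):
        index = chunk.index if chunk.index is not None else position
        # Grow the list until the target slot exists.
        while len(tool_calls) <= index:
            tool_calls.append(
                {"id": None, "type": "function", "function": {"name": "", "arguments": ""}}
            )
        entry = tool_calls[index]
        if chunk.id:
            entry["id"] = chunk.id
        if chunk.function is not None:
            if chunk.function.name:
                entry["function"]["name"] += chunk.function.name
            if chunk.function.arguments:
                entry["function"]["arguments"] += chunk.function.arguments
    return tool_calls
```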
```python
from phi.agent import Agent
from phi.model.openai import OpenAIChat
from phi.tools.duckduckgo import DuckDuckGo

llm_4o = OpenAIChat(
    temperature=0,
    id="gpt-4o",
    api_key="xxxxxxxxxxxxxxxxxxxx=",
    base_url="http://10.100.100.100:10000/v1",
)

web_agent = Agent(
    name="Web Agent",
    model=llm_4o,
    tools=[DuckDuckGo()],
    instructions=["Always include sources"],
    show_tool_calls=True,
    markdown=True,
)
web_agent.print_response("Whats happening in France?", stream=True)
```
```
Traceback (most recent call last):
  File "D:\AIGC\idataai_py\test\experiment.py", line 49, in <module>
    web_agent.print_response("Whats happening in France?", stream=True)
  File "D:\AIGC\idataai_py\venv\Lib\site-packages\phi\agent\agent.py", line 2605, in print_response
    for resp in self.run(message=message, messages=messages, stream=True, **kwargs):
  File "D:\AIGC\idataai_py\venv\Lib\site-packages\phi\agent\agent.py", line 1610, in _run
    for model_response_chunk in self.model.response_stream(messages=messages_for_model):
  File "D:\AIGC\idataai_py\venv\Lib\site-packages\phi\model\openai\chat.py", line 836, in response_stream
    _tool_calls = self._build_tool_calls(stream_data.response_tool_calls)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\AIGC\idataai_py\venv\Lib\site-packages\phi\model\openai\chat.py", line 941, in _build_tool_calls
    if len(tool_calls) <= _index:
       ^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: '<=' not supported between instances of 'int' and 'NoneType'
```
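Until this is fixed upstream, one possible stopgap is to normalize `index=None` on the streamed chunks before phidata's `_build_tool_calls` sees them. This is an untested sketch based only on the traceback above; it assumes each chunk exposes a writable `index` attribute and that the proxy sends one chunk per tool call (otherwise the positional fallback could split a single call across entries):

```python
from phi.model.openai import OpenAIChat

_original_build_tool_calls = OpenAIChat._build_tool_calls


def _patched_build_tool_calls(self, response_tool_calls):
    # Some OpenAI-compatible proxies stream tool-call chunks with index=None,
    # which breaks phidata's `len(tool_calls) <= _index` comparison.
    # Assign a sequential fallback index, then delegate to the original method.
    for position, chunk in enumerate(response_tool_calls or []):
        if getattr(chunk, "index", None) is None:
            chunk.index = position
    return _original_build_tool_calls(self, response_tool_calls)


OpenAIChat._build_tool_calls = _patched_build_tool_calls
```

Apply the patch once at startup, before constructing the agent. It is only a workaround, so a proper fix in phidata would still be appreciated.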