Hi,
I am trying to use a custom OpenAI endpoint model (basically a service with an OpenAI API proxy that runs a custom model behind the scenes, much like Ollama but fully compatible with the openai module).
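For context, the client is wired up in the usual OpenAI-compatible way, roughly like the sketch below (the URL, API key, and mode shown here are placeholders to illustrate the setup, not my actual config or farfalle's exact code):

```python
# Minimal sketch of the client setup, with placeholder values.
# The proxy exposes the standard OpenAI REST API, so the stock openai
# client is pointed at it via base_url and then patched by instructor
# so that structured outputs are requested via tool/function calling.
import instructor
from openai import AsyncOpenAI

client = instructor.from_openai(
    AsyncOpenAI(
        base_url="http://localhost:8000/v1",  # placeholder proxy URL
        api_key="not-needed",                 # the proxy ignores the key
    ),
    mode=instructor.Mode.TOOLS,               # tool/function-calling mode
)
```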
I am getting this error:
Task exception was never retrieved
future: <Task finished name='Task-14' coro=<generate_related_queries() done, defined at /home/hangyu5/Documents/Gitrepo-My/AIResearchVault/repo/LLMApp/farfalle/src/backend/related_queries.py:45> exception=AssertionError('Instructor does not support multiple tool calls, use List[Model] instead.')>
Traceback (most recent call last):
File "/home/hangyu5/Documents/Gitrepo-My/AIResearchVault/repo/LLMApp/farfalle/src/backend/related_queries.py", line 53, in generate_related_queries
related = await client.chat.completions.create(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/hangyu5/anaconda3/envs/farfalle/lib/python3.11/site-packages/instructor/client.py", line 273, in create
return await self.create_fn(
^^^^^^^^^^^^^^^^^^^^^
File "/home/hangyu5/anaconda3/envs/farfalle/lib/python3.11/site-packages/instructor/patch.py", line 119, in new_create_async
response = await retry_async(
^^^^^^^^^^^^^^^^^^
File "/home/hangyu5/anaconda3/envs/farfalle/lib/python3.11/site-packages/instructor/retry.py", line 219, in retry_async
async for attempt in max_retries:
File "/home/hangyu5/anaconda3/envs/farfalle/lib/python3.11/site-packages/tenacity/_asyncio.py", line 123, in __anext__
do = await self.iter(retry_state=self._retry_state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/hangyu5/anaconda3/envs/farfalle/lib/python3.11/site-packages/tenacity/_asyncio.py", line 110, in iter
result = await action(retry_state)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/hangyu5/anaconda3/envs/farfalle/lib/python3.11/site-packages/tenacity/_asyncio.py", line 78, in inner
return fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/home/hangyu5/anaconda3/envs/farfalle/lib/python3.11/site-packages/tenacity/__init__.py", line 410, in exc_check
raise retry_exc.reraise()
^^^^^^^^^^^^^^^^^^^
File "/home/hangyu5/anaconda3/envs/farfalle/lib/python3.11/site-packages/tenacity/__init__.py", line 183, in reraise
raise self.last_attempt.result()
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/hangyu5/anaconda3/envs/farfalle/lib/python3.11/concurrent/futures/_base.py", line 449, in result
return self.__get_result()
^^^^^^^^^^^^^^^^^^^
File "/home/hangyu5/anaconda3/envs/farfalle/lib/python3.11/concurrent/futures/_base.py", line 401, in __get_result
raise self._exception
File "/home/hangyu5/anaconda3/envs/farfalle/lib/python3.11/site-packages/instructor/retry.py", line 226, in retry_async
return await process_response_async(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/hangyu5/anaconda3/envs/farfalle/lib/python3.11/site-packages/instructor/process_response.py", line 75, in process_response_async
model = response_model.from_response(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/hangyu5/anaconda3/envs/farfalle/lib/python3.11/site-packages/instructor/function_calls.py", line 115, in from_response
return cls.parse_tools(completion, validation_context, strict)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/hangyu5/anaconda3/envs/farfalle/lib/python3.11/site-packages/instructor/function_calls.py", line 201, in parse_tools
len(message.tool_calls or []) == 1
AssertionError: Instructor does not support multiple tool calls, use List[Model] instead.
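For what it's worth, the assertion comes from instructor's parse_tools, which fires when the assistant message contains more than one tool call. The error message itself suggests the List[Model] pattern, which as I understand it looks roughly like the sketch below (RelatedQuery and the model name are hypothetical stand-ins for whatever response_model farfalle actually passes):

```python
# Sketch of the List[Model] pattern the assertion message points to.
# Wrapping the response model in List[...] tells instructor to accept
# a response that contains multiple tool calls / extracted objects.
from typing import List
from pydantic import BaseModel


class RelatedQuery(BaseModel):
    query: str


async def get_related(client):  # client = the instructor-patched client above
    return await client.chat.completions.create(
        model="my-custom-model",            # placeholder model name
        response_model=List[RelatedQuery],  # list form instead of a single model
        messages=[
            {"role": "user", "content": "Suggest 3 related search queries."},
        ],
    )
```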
I simply want to confirm that it is my model that cannot handle these function calls correctly, and that this has nothing to do with instructor or farfalle, right?
Thanks!