Closed: HAL9KKK closed this issue 2 weeks ago
When I upgraded llama-index-core from 0.10.51 to 0.10.52.post1, it resolved issue #93, but the same error described in this issue appeared.
This is my log message:

```
INFO:llama_agents.message_queues.simple - Consumer AgentService-937a8c9d-7164-47e8-acc1-ff5440d1b4c9: dumb_fact_agent has been registered.
INFO:llama_agents.message_queues.simple - Consumer c5892a8a-fa83-447a-950c-fbd3f924d676: human has been registered.
INFO:llama_agents.message_queues.simple - Consumer ControlPlaneServer-1c6e7b50-23c6-48e0-afa8-858e77ef2b9d: control_plane has been registered.
INFO:llama_agents.services.agent - dumb_fact_agent launch_local
INFO:llama_agents.message_queues.base - Publishing message to 'control_plane' with action 'ActionTypes.NEW_TASK'
INFO:llama_agents.message_queues.simple - Launching message queue locally
D:\ProgramData\Miniconda3\envs\test\Lib\site-packages\llama_index\core\llms\llm.py:684: RuntimeWarning: coroutine 'Dispatcher.span.<locals>.async_wrapper' was never awaited
  output = AgentChatResponse(
RuntimeWarning: Enable tracemalloc to get the object allocation traceback
INFO:llama_agents.message_queues.base - Publishing message to 'human' with action 'ActionTypes.COMPLETED_TASK'
INFO:llama_agents.message_queues.simple - Successfully published message 'control_plane' to consumer.
INFO:llama_agents.message_queues.simple - Successfully published message 'human' to consumer.
Result: An error occurred while running the tool: 'coroutine' object has no attribute 'output'
```
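For context, the two symptoms in the log ("coroutine ... was never awaited" and "'coroutine' object has no attribute 'output'") come from the same mistake: an async function was called without being awaited, so the caller receives a coroutine object instead of a response. Here is a minimal, self-contained sketch of that failure mode; `achat` and `ChatResponse` are hypothetical stand-ins, not the real llama-index API.

```python
import asyncio

class ChatResponse:
    """Illustrative stand-in for an agent chat response."""
    def __init__(self, output):
        self.output = output

async def achat(message):
    # Simulates an async LLM call.
    return ChatResponse(f"echo: {message}")

def broken_call(message):
    # Bug: calling an async function without awaiting it returns a
    # coroutine object, not a ChatResponse, so `.output` does not exist.
    result = achat(message)
    try:
        return result.output
    except AttributeError as exc:
        result.close()  # discard the coroutine to avoid the "never awaited" warning
        return f"An error occurred while running the tool: {exc}"

def fixed_call(message):
    # Fix: actually drive the coroutine to completion.
    return asyncio.run(achat(message)).output
```

`broken_call("hi")` reproduces the exact error string from the log, while `fixed_call("hi")` returns the response output as expected.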
The following are the packages I currently have installed and their corresponding versions:

| package name | version |
|---|---|
| llama-agents | 0.0.4 |
| llama-cloud | 0.0.6 |
| llama-index | 0.10.52 |
| llama-index-agent-openai | 0.2.7 |
| llama-index-cli | 0.1.12 |
| llama-index-core | 0.10.52.post1 |
| llama-index-embeddings-gemini | 0.1.8 |
| llama-index-embeddings-openai | 0.1.10 |
| llama-index-indices-managed-llama-cloud | 0.2.2 |
| llama-index-legacy | 0.9.48 |
| llama-index-llms-gemini | 0.1.11 |
| llama-index-llms-openai | 0.1.23 |
| llama-index-multi-modal-llms-openai | 0.1.6 |
| llama-index-program-openai | 0.1.6 |
| llama-index-question-gen-openai | 0.1.3 |
| llama-index-readers-file | 0.1.25 |
| llama-index-readers-llama-parse | 0.1.4 |
| llama-parse | 0.4.4 |
Yea, that PR wasn't meant to fix this one. I see the issue here as well, but ngl open-source models suck at being agents, so I'm struggling to confirm whether I've fully fixed it or llama3 just sucks 😁
This is fixed in llama-index-core v0.10.52.post2. Run:

```
pip install -U llama-index-core
```

to get it.
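After upgrading, it's worth confirming which version actually resolved in your environment (mixed llama-index subpackage pins can silently hold `llama-index-core` back). A small stdlib-only check, with the distribution name as an assumed default:

```python
from importlib.metadata import PackageNotFoundError, version

def installed_version(dist="llama-index-core"):
    """Return the installed version of a distribution, or None if it is absent."""
    try:
        return version(dist)
    except PackageNotFoundError:
        return None

# Usage: after `pip install -U llama-index-core`, this should report
# 0.10.52.post2 or later.
# print(installed_version())
```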
I am using LM Studio and have run into a lot of frustration. I would advise publishing an example of how to run this locally with LM Studio or Ollama.
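Until an official example lands, a local-model setup can be sketched roughly as follows. This is untested configuration, and it assumes the `llama-index-llms-ollama` and `llama-index-llms-openai-like` integration packages are installed, that a local server is running, and that the model names match what you have pulled or loaded:

```python
from llama_index.llms.ollama import Ollama
from llama_index.llms.openai_like import OpenAILike

# Option 1: Ollama, which serves on http://localhost:11434 by default.
llm = Ollama(model="llama3", request_timeout=120.0)

# Option 2: LM Studio, via its OpenAI-compatible local server.
llm = OpenAILike(
    model="local-model",                  # whichever model LM Studio has loaded
    api_base="http://localhost:1234/v1",  # LM Studio's default server address
    api_key="lm-studio",                  # LM Studio does not check the key
    is_chat_model=True,
)

# Either `llm` can then be passed to the agent/service in place of an
# OpenAI-backed LLM.
print(llm.complete("Say hello"))
```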
I encountered this error with this code:

This is the result: