Thanks for this great library.
I tried to update the code to llama_index 0.10.3.
@sourabhdesai I've seen that the branch "sour/update_li/" is attempting this.
The app is working, but there is a problem with the sub_questions:
I activated the trace to debug.
I've seen that with llama_index 0.9.7 we had something like:
backend-llama-app-fastapi-1 | [986e07e8-6f86-4e19-ac9c-5755e0e8ed35] Q: What are the biggest risks mentioned in the SEC 10-K filing for Amazon.com Inc. (AMZN) for the 2021 time period?
backend-llama-app-fastapi-1 | [986e07e8-6f86-4e19-ac9c-5755e0e8ed35] A: The SEC 10-K filing for Amazon.com Inc. [...] expansion of established companies into Amazon's market segments.
backend-llama-app-fastapi-1 | Got output: The biggest discussed risks in the SEC 10-K filing for Amazon.com Inc. (AMZN) for [...] Amazon's market segments.
backend-llama-app-fastapi-1 | ========================
backend-llama-app-fastapi-1 |
backend-llama-app-fastapi-1 | STARTING TURN 2
backend-llama-app-fastapi-1 | ---------------
backend-llama-app-fastapi-1 |
backend-llama-app-fastapi-1 | **********
backend-llama-app-fastapi-1 | Trace: chat
backend-llama-app-fastapi-1 | |_CBEventType.AGENT_STEP -> 6.839175 seconds
backend-llama-app-fastapi-1 | |_CBEventType.LLM -> 0.740034 seconds
backend-llama-app-fastapi-1 | |_CBEventType.FUNCTION_CALL -> 5.664771 seconds
backend-llama-app-fastapi-1 | |_CBEventType.QUERY -> 5.664454 seconds
backend-llama-app-fastapi-1 | |_CBEventType.LLM -> 1.326065 seconds
backend-llama-app-fastapi-1 | |_CBEventType.SUB_QUESTION -> 2.471 seconds
backend-llama-app-fastapi-1 | |_CBEventType.QUERY -> 2.470408 seconds
backend-llama-app-fastapi-1 | |_CBEventType.RETRIEVE -> 0.333445 seconds
backend-llama-app-fastapi-1 | |_CBEventType.EMBEDDING -> 0.142882 seconds
backend-llama-app-fastapi-1 | |_CBEventType.SYNTHESIZE -> 2.13674 seconds
backend-llama-app-fastapi-1 | |_CBEventType.TEMPLATING -> 5.2e-05 seconds
backend-llama-app-fastapi-1 | |_CBEventType.LLM -> 2.128845 seconds
backend-llama-app-fastapi-1 | |_CBEventType.SYNTHESIZE -> 1.865582 seconds
backend-llama-app-fastapi-1 | |_CBEventType.TEMPLATING -> 3e-05 seconds
backend-llama-app-fastapi-1 | |_CBEventType.LLM -> 1.863571 seconds
backend-llama-app-fastapi-1 | |_CBEventType.LLM -> 0.0 seconds
Now we have something like:
backend-llama-app-fastapi-bis-1 | [adf2c5bd-a7d7-42f6-aa48-f8b00d8af260] Q: What are the biggest risks mentioned in the SEC 10-K filing for Amazon.com Inc. (AMZN) for the 2022 time period?
backend-llama-app-fastapi-bis-1 | [adf2c5bd-a7d7-42f6-aa48-f8b00d8af260] A: The SEC 10-K filing for Amazon.com Inc. [...] expansion of established companies into Amazon's market segments.
backend-llama-app-fastapi-bis-1 | Got output: The biggest discussed risks in the SEC 10-K filing for Amazon.com Inc. (AMZN) for [...] Amazon's market segments.
backend-llama-app-fastapi-bis-1 | ========================
backend-llama-app-fastapi-bis-1 |
backend-llama-app-fastapi-bis-1 | **********
backend-llama-app-fastapi-bis-1 | Trace: chat
backend-llama-app-fastapi-bis-1 | |_CBEventType.AGENT_STEP -> 4.52079 seconds
backend-llama-app-fastapi-bis-1 | |_CBEventType.LLM -> 0.762686 seconds
backend-llama-app-fastapi-bis-1 | |_CBEventType.FUNCTION_CALL -> 3.326152 seconds
backend-llama-app-fastapi-bis-1 | |_CBEventType.LLM -> 0.0 seconds
backend-llama-app-fastapi-bis-1 | **********
So the function get_metadata_from_event in messaging.py doesn't receive any sub_question events, and nothing is displayed in the preview part of the frontend.
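To illustrate why the frontend preview stays empty: if no CBEventType.SUB_QUESTION events appear in the trace, there is simply nothing for the metadata extraction to pick up. Here is a minimal, self-contained sketch of that filtering logic; the enum members and the function body below are stand-ins I wrote for illustration, not the actual sec-insights or llama_index code:

```python
from enum import Enum

# Stand-in mirroring the llama_index CBEventType names seen in the traces;
# only the members used in this sketch are defined.
class CBEventType(Enum):
    AGENT_STEP = "agent_step"
    FUNCTION_CALL = "function_call"
    SUB_QUESTION = "sub_question"
    LLM = "llm"

def collect_sub_questions(events):
    """Simplified stand-in for get_metadata_from_event: keep only the
    payloads of SUB_QUESTION events from (event_type, payload) pairs."""
    return [payload for event_type, payload in events
            if event_type is CBEventType.SUB_QUESTION]

# A 0.9.7-style trace contains a SUB_QUESTION event...
old_trace = [
    (CBEventType.AGENT_STEP, None),
    (CBEventType.LLM, None),
    (CBEventType.SUB_QUESTION, {"question": "What are the biggest risks?"}),
]
# ...while the 0.10.3 trace above only shows AGENT_STEP / LLM / FUNCTION_CALL.
new_trace = [
    (CBEventType.AGENT_STEP, None),
    (CBEventType.LLM, None),
    (CBEventType.FUNCTION_CALL, None),
]

print(len(collect_sub_questions(old_trace)))  # 1
print(len(collect_sub_questions(new_trace)))  # 0
```

So the symptom looks less like a bug in get_metadata_from_event itself and more like the sub-question engine no longer emitting those callback events after the upgrade.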
Do you have any clues about the issue, or a starting point on what to update?