run-llama / create-llama

The easiest way to get started with LlamaIndex

Handle errors that occur while streaming (NextJS and Python) #426

Open · klei30 opened this issue 4 days ago

klei30 commented 4 days ago

Description

While using an application created with npx create-llama@latest --pro (FastAPI backend, Next.js frontend, and Pinecone integration), we encountered multiple errors affecting both frontend JSON parsing and backend API connections.

Frontend Error

An unhandled runtime error occurs in the frontend when the application attempts to parse an error message that is not valid JSON.

Error Message:

Unhandled Runtime Error
SyntaxError: Unexpected token 'e', "network error" is not valid JSON

Source:

app\components\chat-section.tsx (18:18) @ parse

  16 |     onError: (error: unknown) => {
  17 |       if (!(error instanceof Error)) throw error;
> 18 |       alert(JSON.parse(error.message).detail);
     |                  ^  
  19 |     },
  20 |   });
  21 |   return (
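
A defensive way to avoid this crash is to fall back to the raw error text when the message is not valid JSON. The sketch below is a suggestion only (the getErrorDetail helper is not part of the generated template); it assumes the backend sends either a JSON body with a detail field, as in the code above, or plain text such as "network error":

    // Sketch of a more defensive error handler for chat-section.tsx.
    function getErrorDetail(error: unknown): string {
      if (!(error instanceof Error)) throw error;
      try {
        // The template expects a JSON body like {"detail": "..."}.
        const parsed = JSON.parse(error.message);
        if (parsed && typeof parsed.detail === "string") return parsed.detail;
      } catch {
        // Message was not JSON (e.g. "network error"); fall through to the raw text.
      }
      return error.message;
    }

    // Usage inside the chat hook options shown above:
    //   onError: (error: unknown) => alert(getErrorDetail(error)),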

Backend Error

In addition, the backend raises an openai.APIConnectionError, likely due to connectivity issues when making requests to the OpenAI API.

Backend Error Traceback:

    return await self._retry_request(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\PC\AppData\Local\pypoetry\Cache\virtualenvs\app-EUVYEV_Q-py3.11\Lib\site-packages\openai\_base_client.py", line 1596, in _request
    return await self._retry_request(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\PC\AppData\Local\pypoetry\Cache\virtualenvs\app-EUVYEV_Q-py3.11\Lib\site-packages\openai\_base_client.py", line 1606, in _request
    raise APIConnectionError(request=request) from err
openai.APIConnectionError: Connection error.

Steps to Reproduce

  1. Use the npx create-llama@latest --pro template with the FastAPI backend and the Next.js frontend.
  2. Set up Pinecone for vector storage.
  3. Trigger the frontend error by causing a network issue.
  4. Observe the frontend JSON parsing error and backend connection error.

Expected Behavior

Errors that occur while streaming should be surfaced to the user as a readable message instead of crashing the frontend with a JSON parsing error.

Environment

Windows, Python 3.11 (Poetry virtualenv), FastAPI backend with Next.js frontend, Pinecone vector store.

marcusschiesser commented 1 day ago

The error happened while streaming, and error handling during streaming is not implemented yet (neither in Next.js nor in Python).
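
For reference, one possible shape for this on the Next.js side, sketched with standard Web APIs only and not taken from the current template: catch exceptions raised mid-stream and forward them to the client as a final JSON chunk instead of dropping the connection. The streamTokens generator and the { error: ... } chunk format are assumptions for illustration.

    // Hypothetical sketch for a Next.js route handler (e.g. app/api/chat/route.ts).
    export async function POST(req: Request): Promise<Response> {
      const encoder = new TextEncoder();

      const stream = new ReadableStream<Uint8Array>({
        async start(controller) {
          try {
            for await (const token of streamTokens(req)) {
              controller.enqueue(encoder.encode(token));
            }
          } catch (err) {
            // Surface the failure as a structured chunk the client can detect,
            // rather than aborting the response mid-stream.
            const detail = err instanceof Error ? err.message : "stream error";
            controller.enqueue(encoder.encode(JSON.stringify({ error: detail })));
          } finally {
            controller.close();
          }
        },
      });

      return new Response(stream, {
        headers: { "Content-Type": "text/plain; charset=utf-8" },
      });
    }

    // Placeholder for whatever async iterator produces the model output.
    async function* streamTokens(_req: Request): AsyncGenerator<string> {
      yield "";
    }

The frontend would then need to check the streamed chunks for that error shape before rendering, instead of assuming the whole error message is JSON.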

klei30 commented 1 day ago

Is the streaming also causing issues in the DuckDuckGo search functionality?

https://github.com/run-llama/create-llama/issues/427