chandrasekharan-zipstack closed this pull request 2 months ago
Issues: 3 New issues, 0 Accepted issues
Measures: 0 Security Hotspots, No data about Coverage, 0.0% Duplication on New Code
@chandrasekharan-zipstack In screenshots 2 and 3, we need to show the error description without the JSON. Is this achievable only if we get rid of the general exception catch, or is it because the error happens outside our code?
@Deepak-Kesavan the error gets triggered from
OpenAI -> llama_index methods -> SDK -> Prompt Service
Ideally our LLM adapter should expose the `.complete()` method so that the error flows as
OpenAI -> llama_index methods -> Adapter -> SDK -> Prompt Service
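A minimal sketch of that idea, assuming a hypothetical `OpenAIAdapter` wrapper around a llama_index LLM: the adapter exposes `.complete()` and translates provider exceptions into a plain error description instead of letting the raw JSON error payload bubble up through the SDK and Prompt Service.

```python
# Minimal sketch (hypothetical names): an adapter that exposes .complete()
# and converts provider/llama_index errors into a clean, user-facing message.


class AdapterError(Exception):
    """Raised by the adapter with a plain error description (no raw JSON)."""


class OpenAIAdapter:
    def __init__(self, llm):
        # `llm` is assumed to be a llama_index LLM instance (e.g. OpenAI)
        self._llm = llm

    def complete(self, prompt: str):
        try:
            return self._llm.complete(prompt)
        except Exception as exc:
            # Keep the broad catch confined to the LLM call and surface only
            # the message, so the layers above (SDK, prompt service, FE) can
            # show a readable description instead of the provider's JSON body.
            raise AdapterError(str(exc)) from exc
```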
What
- `prompt-service` to log exceptions similar to `backend` and wrap them with a valid response format (see the sketch after this list)
- `prompt-service` FE status updates to include the `doc_name`
- `backend` to parse and report this as an error, avoiding error suppression
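A minimal sketch of the first item, assuming a Flask-based prompt service and hypothetical endpoint, helper, and field names: the exception is logged with its stack trace (similar to how the backend logs errors) and wrapped in a valid JSON response that also carries the `doc_name`.

```python
# Minimal sketch (hypothetical endpoint and field names): log the exception
# like the backend does and wrap it in a valid JSON error response.
import logging

from flask import Flask, jsonify, request

app = Flask(__name__)
logger = logging.getLogger(__name__)


def run_prompt(payload: dict) -> str:
    """Hypothetical helper that actually answers the prompt."""
    raise NotImplementedError


@app.route("/answer-prompt", methods=["POST"])
def answer_prompt():
    payload = request.get_json(force=True)
    doc_name = payload.get("doc_name", "")
    try:
        result = run_prompt(payload)
        return jsonify({"status": "OK", "doc_name": doc_name, "output": result})
    except Exception as exc:
        # Log with the full stack trace instead of letting the raw error escape
        logger.exception("Error answering prompt for document %s", doc_name)
        # Wrap in a valid response format so the SDK/backend can parse it
        return jsonify({"status": "ERROR", "doc_name": doc_name, "error": str(exc)}), 500
```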
Why
How
Handled in the `backend`, and also pushed some updates to the FE (see the sketch below).
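A minimal sketch of the backend side, assuming the prompt service returns the hypothetical `status` / `error` / `doc_name` fields shown above: the response is parsed and failures are reported, with the `doc_name`, instead of being suppressed.

```python
# Minimal sketch (hypothetical field names): parse the prompt service response
# and report failures instead of swallowing them.
import logging

logger = logging.getLogger(__name__)


class PromptServiceError(Exception):
    """Raised when the prompt service reports an error for a document."""


def parse_prompt_service_response(response: dict) -> dict:
    doc_name = response.get("doc_name", "")
    if response.get("status") == "ERROR":
        message = response.get("error", "Unknown error from prompt service")
        # Report the error (and which document it belongs to) instead of
        # suppressing it, so the FE status update can show a useful message.
        logger.error("Prompt service failed for '%s': %s", doc_name, message)
        raise PromptServiceError(f"{doc_name}: {message}")
    return response.get("output", {})
```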
Can this PR break any existing features? If yes, please list possible items. If no, please explain why. (PS: Admins do not merge the PR without this section filled)
Relevant Docs
Notes on Testing
Screenshots
Checklist
I have read and understood the [Contribution Guidelines]().