Closed — Hugoang08 closed this issue 4 months ago
Hi there. This is likely not a problem in llmWrapper/Neuro. From your second image we can see that the LLM is generating only one token (the stop token), so it looks like the AI didn't produce a response. Please verify that your API (specifically the chat endpoint) is functioning correctly, using a tool like Bruno with the system prompt as the prompt. If the API works, use neurofrontend's "lobotomy" tab to inspect the prompt being sent to the LLM and check whether anything looks off.
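If you'd rather script the check than use Bruno, a minimal sketch like the one below can exercise the chat endpoint directly. This assumes an OpenAI-compatible chat completions API; the URL, prompt strings, and `max_tokens` value are placeholders, not values from this issue:

```python
import json
import urllib.request

def build_chat_request(url, system_prompt, user_message):
    # Build an OpenAI-style chat completion request.
    body = json.dumps({
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_message},
        ],
        "max_tokens": 64,
    }).encode("utf-8")
    return urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )

def send_chat_request(req):
    # Send the request and return the assistant's reply text.
    # If this comes back empty, the problem is on the API side,
    # not in llmWrapper.
    with urllib.request.urlopen(req, timeout=30) as resp:
        data = json.load(resp)
    return data["choices"][0]["message"]["content"]

# Example (hypothetical local endpoint):
# req = build_chat_request(
#     "http://localhost:5000/v1/chat/completions",
#     "You are a helpful assistant.",
#     "Say hello.",
# )
# print(send_chat_request(req))
```

If this script also gets an empty reply, the backend itself is returning nothing and the issue is upstream of llmWrapper.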
Closing for staleness. Please feel free to reopen if you have further comments.
The request process in llmWrapper isn't returning any response. Any suggestions on how to fix this?