Open Dustinhoefer opened 1 year ago
A common reason for error 400 is that the prompt exceeds the character limit allowed by the API.
I am asking it to refactor this piece of code. I am pretty sure it should not be too long :D
```java
private static void loadSprueche() {
    try {
        BufferedReader reader = new BufferedReader(
                new InputStreamReader(main.class.getResourceAsStream("/sprueche.txt")));
        while (reader.ready()) {
            alleSprueche.add(reader.readLine());
        }
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
```
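For comparison, a refactored sketch of the snippet above might use try-with-resources and a null-safe resource lookup. The class name `SpruecheLoader` and the extracted `readAllLines` helper are hypothetical (introduced here so the reading logic can be exercised without a classpath resource); `/sprueche.txt` comes from the original code.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

class SpruecheLoader {

    // Hypothetical helper: reads every line from a stream into a list.
    // Extracted from loadSprueche() so it can be tested without a resource file.
    static List<String> readAllLines(InputStream in) throws IOException {
        List<String> lines = new ArrayList<>();
        // try-with-resources closes the reader even if readLine() throws
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(in, StandardCharsets.UTF_8))) {
            String line;
            while ((line = reader.readLine()) != null) {
                lines.add(line);
            }
        }
        return lines;
    }

    // Sketch of the refactored loadSprueche(): guards against a missing
    // resource (getResourceAsStream returns null) instead of risking an NPE.
    static List<String> loadSprueche() {
        InputStream in = SpruecheLoader.class.getResourceAsStream("/sprueche.txt");
        if (in == null) {
            return List.of();
        }
        try {
            return readAllLines(in);
        } catch (IOException e) {
            e.printStackTrace();
            return List.of();
        }
    }
}
```

Using `readLine() != null` as the loop condition avoids the `ready()` pitfall, where a momentarily unready stream ends the loop before the file is fully read.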
Maybe you are editing a very large file? The plugin sends the content of the entire file as context in the prompt message. Also try clearing the context with the "clean" button; otherwise the messages accumulate and you may eventually run out of context window.
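To illustrate why sending the whole file can blow past the limit, a client could cap the prompt before sending. This is only a sketch, not the plugin's actual behaviour; the character budget and the keep-the-tail strategy are assumptions.

```java
class PromptTrimmer {
    // Hypothetical cap; real API limits are measured in tokens, not characters.
    static final int MAX_PROMPT_CHARS = 16_000;

    // Keeps the tail of the file, where the selection under discussion
    // is most likely to be; drops the head if the file is too large.
    static String trimToBudget(String fileContent) {
        if (fileContent.length() <= MAX_PROMPT_CHARS) {
            return fileContent;
        }
        return fileContent.substring(fileContent.length() - MAX_PROMPT_CHARS);
    }
}
```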
Hi, I'm getting error 400 too when asking to DOCUMENT or DISCUSS a snippet from an 80-line file.
Unable to run the task: java.lang.RuntimeException: java.io.IOException: Request failed with status code: 400 and response body: jdk.internal.net.http.ResponseSubscribers$HttpResponseInputStream@7b73f626
Writing a question directly in the ChatGPT View seems to work.
Hi, I have a similar error. I can chat within the view, but when I select code to discuss I get this error:
Unable to run the task: java.lang.RuntimeException: java.io.IOException: Request failed with status code: 400 and response body: jdk.internal.net.http.ResponseSubscribers$HttpResponseInputStream@3ebfa718
I think I found a workaround:
1. Open the ChatGPT view.
2. Say hello.
3. Use the AI assist plugin.
When I don't manually open the view and call the plugin directly, it just shows the error.
Can confirm: the workaround worked for me as well. Once I do a manual chat, the HTTP 400 is gone afterwards.
I highly recommend using one of the GPT-4 Turbo models, e.g. gpt-4-1106-preview. GPT-4 Turbo has an extended context window (128k tokens), which is essential for handling large source files. If you encounter 400 errors, the most probable cause is exceeding the context window limit: the model has a maximum capacity that, when surpassed, results in these errors. For large source files, the 8k-token window of plain gpt-4 is simply not enough.
Unable to run the task: java.lang.RuntimeException: java.io.IOException: Request failed with status code: 400 and response body: jdk.internal.net.http.ResponseSubscribers$HttpResponseInputStream@4fccaa0f
Using Model Name gpt-3.5-turbo