When the apiBase URL returns a 502 (here FastChat serves LLM models behind nginx, but the backend server is down), the IntelliJ chat mode freezes and the IDE menus become unreachable.
I managed to reproduce the behaviour with a simple Flask webserver:
$ cat redirect.py
from flask import Flask

app = Flask(__name__)

# Always answer /chat/completions with an empty 502 response,
# mimicking nginx when the upstream LLM server is down.
@app.route('/chat/completions', methods=['POST'])
def index():
    return '', 502
$ cat run.sh
export FLASK_APP=redirect.py
flask run --host=0.0.0.0
$ ./run.sh
* Serving Flask app 'redirect.py'
* Debug mode: off
WARNING: This is a development server. Do not use it in a production deployment. Use a production WSGI server instead.
* Running on all addresses (0.0.0.0)
* Running on http://127.0.0.1:5000
* Running on http://10.9.29.124:5000
Press CTRL+C to quit
11.1.17.41 - - [14/Jun/2024 11:17:49] "POST /chat/completions HTTP/1.1" 502 -
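The stub can also be sanity-checked without the IDE. A minimal check using Flask's built-in test client, with the same handler as redirect.py above (a sketch, requires Flask installed):

```python
from flask import Flask

app = Flask(__name__)

# Same stub handler as redirect.py: every POST gets an empty 502.
@app.route('/chat/completions', methods=['POST'])
def index():
    return '', 502

# Flask's test client exercises the route without a running server.
resp = app.test_client().post('/chat/completions')
print(resp.status_code)  # 502
```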
Configure the Continue plugin in .continue/config.json to use the webserver:
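For example, a minimal `.continue/config.json` entry pointing at the stub (the title and model name here are placeholders; the stub returns 502 regardless of the model requested):

```json
{
  "models": [
    {
      "title": "FastChat (stub)",
      "provider": "openai",
      "model": "gpt-3.5-turbo",
      "apiBase": "http://127.0.0.1:5000"
    }
  ]
}
```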
To reproduce
1. Configure a webserver as described in the Description.
2. Copy the mysterious_function into the chat and ask a question like "what does this code do?"
3. IntelliJ freezes.
Log output
Code: undefined
Error number: undefined
Syscall: undefined
Type: undefined
Error: HTTP 502 BAD GATEWAY from http://127.0.0.1:5000/chat/completions
at customFetch (C:\snapshot\continue-deploy\binary\out\index.js:334510:17)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
at async withExponentialBackoff (C:\snapshot\continue-deploy\binary\out\index.js:334295:22)
at async OpenAI._streamChat (C:\snapshot\continue-deploy\binary\out\index.js:335029:22)
at async OpenAI.streamChat (C:\snapshot\continue-deploy\binary\out\index.js:334704:26)
at async llmStreamChat (C:\snapshot\continue-deploy\binary\out\index.js:359976:19)
at async C:\snapshot\continue-deploy\binary\out\index.js:360453:30
[2024-06-14T09:17:49] Error: HTTP 502 BAD GATEWAY from http://127.0.0.1:5000/chat/completions
[2024-06-14T09:17:49] Error running handler for "llm/streamChat": Error: HTTP 502 BAD GATEWAY from http://127.0.0.1:5000/chat/completions
[2024-06-14T09:17:49] Error: HTTP 502 BAD GATEWAY from http://127.0.0.1:5000/chat/completions