continuedev / continue

⏩ Continue is the leading open-source AI code assistant. You can connect any models and any context to build custom autocomplete and chat experiences inside VS Code and JetBrains.
https://continue.dev/docs
Apache License 2.0

IntelliJ IDE freezes when the upstream URL returns a 502 (models unavailable) #1483

Open zoobab opened 2 weeks ago

zoobab commented 2 weeks ago

Relevant environment info

- OS: Windows 10 build 19045
- Continue: 0.0.50
- IDE: Intellij idealC-2023.2.5.win

Description

When the apiBase URL returns a 502 (here FastChat serves LLM models behind nginx, but the backend server is down), the IntelliJ chat mode freezes and the IDE menus become unreachable.

I managed to reproduce the server's behaviour with a simple Flask app that always returns a 502:

$ cat redirect.py 
from flask import Flask

app = Flask(__name__)

@app.route('/chat/completions', methods=['POST'])
def index():
    # Always answer with an empty 502, like nginx fronting a dead upstream
    return '', 502

$ cat run.sh 
export FLASK_APP=redirect.py
flask run --host=0.0.0.0

$ ./run.sh 
 * Serving Flask app 'redirect.py'
 * Debug mode: off
WARNING: This is a development server. Do not use it in a production deployment. Use a production WSGI server instead.
 * Running on all addresses (0.0.0.0)
 * Running on http://127.0.0.1:5000
 * Running on http://10.9.29.124:5000
Press CTRL+C to quit
11.1.17.41 - - [14/Jun/2024 11:17:49] "POST /chat/completions HTTP/1.1" 502 -
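For anyone without Flask installed, the same 502 stub can be sketched with only the Python standard library (the port and names below are illustrative, not part of the original report):

```python
# Standard-library alternative to the Flask stub above: every POST to
# /chat/completions (or any path) gets an empty 502 response, mimicking
# nginx in front of a dead upstream.
from http.server import BaseHTTPRequestHandler, HTTPServer

class Stub502Handler(BaseHTTPRequestHandler):
    def do_POST(self):
        self.send_response(502)
        self.send_header("Content-Length", "0")
        self.end_headers()

    def log_message(self, fmt, *args):
        # Silence per-request logging
        pass

def run(host="0.0.0.0", port=5000):
    # Blocks until interrupted, like `flask run` above
    HTTPServer((host, port), Stub502Handler).serve_forever()
```

Calling `run()` then serves 502s on port 5000, matching the Flask transcript.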

Configure the Continue plugin in .continue/config.json to point at the stub server:

{
  "models": [
    {
      "title": "DeepSeek-33B",
      "model": "deepseek-coder-33b-instruct",
      "apiBase": "http://127.0.0.1:5000"
    }
  ]
}

To reproduce

  1. Configure a webserver as described in the Description
  2. Paste the mysterious_function into the chat and ask a question like "what does this code do?"
  3. IntelliJ freezes

Log output

Code: undefined
Error number: undefined
Syscall: undefined
Type: undefined

Error: HTTP 502 BAD GATEWAY from http://127.0.0.1:5000/chat/completions

    at customFetch (C:\snapshot\continue-deploy\binary\out\index.js:334510:17)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async withExponentialBackoff (C:\snapshot\continue-deploy\binary\out\index.js:334295:22)
    at async OpenAI._streamChat (C:\snapshot\continue-deploy\binary\out\index.js:335029:22)
    at async OpenAI.streamChat (C:\snapshot\continue-deploy\binary\out\index.js:334704:26)
    at async llmStreamChat (C:\snapshot\continue-deploy\binary\out\index.js:359976:19)
    at async C:\snapshot\continue-deploy\binary\out\index.js:360453:30
[2024-06-14T09:17:49] Error: HTTP 502 BAD GATEWAY from http://127.0.0.1:5000/chat/completions

[2024-06-14T09:17:49] Error running handler for "llm/streamChat":  Error: HTTP 502 BAD GATEWAY from http://127.0.0.1:5000/chat/completions

[2024-06-14T09:17:49] Error: HTTP 502 BAD GATEWAY from http://127.0.0.1:5000/chat/completions
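The stack trace shows the request passing through a withExponentialBackoff wrapper before the 502 surfaces. A generic sketch of what such a retry helper typically looks like (names and parameters are illustrative; this is not Continue's actual implementation):

```python
import random
import time

def with_exponential_backoff(call, max_retries=5, base_delay=1.0):
    # Retry `call` with a doubling delay plus jitter; if every attempt
    # fails, re-raise the last error (which is what ends up in the log).
    for attempt in range(max_retries):
        try:
            return call()
        except Exception:
            if attempt == max_retries - 1:
                raise
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, 0.1))
```

With a wrapper like this, a persistent 502 keeps the request loop busy for the full retry budget, which is consistent with the error only surfacing after a delay; the freeze itself suggests that waiting happens on (or blocks) the UI thread.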
zoobab commented 2 weeks ago

I tried the same setup in VSCode (1.86.2) with the Continue extension (0.8.31); the chat box briefly shows some colors, but does not throw an error.