twinnydotdev / twinny

The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but completely free and 100% private.
https://twinny.dev
MIT License

Problem with https/tls configuration - #161

Closed Marcinj21 closed 4 months ago

Marcinj21 commented 4 months ago

Hello, I have a problem configuring Twinny to use HTTPS/TLS (I have an Ollama server behind nginx). Over HTTP (port 80) twinny works.

This is my json with twinny config:

{
    "twinny.apiHostname": "ollama.example.com",
    "twinny.chatModelName": "llama2:7b",
    "twinny.fimModelName": "wizardcoder:7b-python",
    "twinny.fimApiPath": "/api/generate",
    "twinny.chatApiPath": "/v1/chat/completions",
    "workbench.settings.applyToAllProfiles": [
        "twinny.disableAutoSuggest"
    ],
    "twinny.chatApiPort": 443,
    "twinny.fimApiPort": 443,
    "twinny.useTls": true
}
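For reference, a client typically composes the request URL from settings like these roughly as follows. This is a hedged sketch of the general pattern, not twinny's actual code; the helper name build_endpoint is made up for illustration:

```python
def build_endpoint(hostname: str, port: int, path: str, use_tls: bool) -> str:
    """Compose a request URL from hostname/port/path/TLS settings
    the way a client plausibly would (illustrative only)."""
    scheme = "https" if use_tls else "http"
    return f"{scheme}://{hostname}:{port}{path}"

# With the settings above, the FIM endpoint would come out as:
# build_endpoint("ollama.example.com", 443, "/api/generate", True)
#   -> "https://ollama.example.com:443/api/generate"
```

If that composed URL works with curl but not from the extension, the problem is in the TLS negotiation rather than the routing.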

Nginx configuration:

server {
    listen 80;
    server_name example.com; # Change this to your domain name

    # Redirect HTTP to HTTPS
    return 301 https://$server_name$request_uri;
}

server {
    listen 443 ssl;
    server_name example.com; # Change this to your domain name

    # SSL configuration
    ssl_certificate /path/to/your/ssl/certificate.crt;
    ssl_certificate_key /path/to/your/ssl/privatekey.key;

    # Other SSL configurations like ssl_protocols, ssl_ciphers, etc. can be added here

    location / {
        proxy_pass http://localhost:3000;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
    }
}
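One thing worth double-checking in a setup like this: proxy_pass points at localhost:3000, while Ollama's default API port is 11434, so it's worth confirming something is actually listening where nginx expects. A minimal stdlib-only Python helper (hypothetical, purely for this sanity check):

```python
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# e.g. port_open("localhost", 3000)   # the upstream this config proxies to
#      port_open("localhost", 11434)  # Ollama's default port
```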

This is the error I get when I try to run a prompt:

2024-03-04 10:22:35.714 [error] [Extension Host] Fetch error: TypeError: fetch failed
    at Object.fetch (node:internal/deps/undici/undici:11576:11)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async e.streamResponse (c:\Users\adm_mjalmuzna\.vscode\extensions\rjmacarthy.twinny-3.7.8\out\index.js:2:121574)


rjmacarthy commented 4 months ago

Hey, can you call your API over HTTPS without the twinny extension?

Marcinj21 commented 4 months ago

Yes, calling over HTTPS via curl https://.. works correctly and returns an answer for the prompt.

rjmacarthy commented 4 months ago

Please check developer tools in vscode when twinny runs and paste the log, thanks.

Marcinj21 commented 4 months ago

Logs from twinny captured in the dev tools (attached):

twinny.json

rjmacarthy commented 4 months ago

Hello, I notice you're trying to use api.openai.com as your hostname, which isn't currently supported.

Many thanks,

Marcinj21 commented 4 months ago

'api.openai.com' is only my DNS server name. We have a local server with GPUs hosting Ollama on our LAN, with NGINX configured for HTTPS/SSL (server name api.openai.com). Over HTTP on port 80, twinny works.

So, do I have to change the server name?

rjmacarthy commented 4 months ago

Hey, sorry I am not actually sure in this case... @rcgtnick any advice, you seem to be the https/server expert! Many thanks.

oxaronick commented 4 months ago

@Marcinj21 Do you have access to your NGINX logs? If so, what is it logging when Twinny makes a request?

Marcinj21 commented 4 months ago

@rcgtnick

Logs from the nginx debug log while using Twinny on port 443: twinny-nginx-debug.log

I also uploaded logs from another VSCode extension, Continue.dev. Continue talks to the same server as Twinny over HTTPS, and it works: continue-nginxdebug-logs.txt

This is my full nginx configuration: nginx-config.txt

oxaronick commented 4 months ago

Based on "peer closed connection in SSL handshake while SSL handshaking" in your NGINX log, I'd say the client (the Node.js HTTP/HTTPS library) wasn't happy with the TLS negotiation. My next steps would be:

  1. verifying that the client is actually trying to connect with HTTPS and not plain HTTP. Can you use tcpdump on the server or client?
  2. verifying the HTTPS endpoint

Verifying client is attempting HTTPS connection

On the client, you could run this (on macOS):

sudo tcpdump -nni en0 port 443 and host your.ollama.endpoint.com -s0 -A

or this (on Linux):

sudo tcpdump -nni any port 443 and host your.ollama.endpoint.com -s0 -A

Or, on the server, you could run this:

sudo tcpdump -nni any port 443 and host 1.2.3.4 -s0 -A (where 1.2.3.4 is your IP)

Those will all show more or less the same data, so any of them work for this test.

The -A will cause tcpdump to print packet contents, so you can look for plain HTTP request headers. If you see anything you can actually read (like Authorization: or POST /...), the traffic is not encrypted and the issue is on the client.
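The distinction is easy to spot in raw bytes: a TLS record begins with content type 0x16 (handshake) followed by version major byte 0x03, while a plain HTTP request begins with a readable ASCII method token. A small illustrative sketch of that check (not part of any tool mentioned here):

```python
def classify_first_bytes(data: bytes) -> str:
    """Guess whether a captured payload is TLS or plaintext HTTP."""
    # TLS records open with content type 0x16 (handshake), version major 0x03
    if len(data) >= 2 and data[0] == 0x16 and data[1] == 0x03:
        return "tls"
    # Plain HTTP requests open with an ASCII method token
    methods = (b"GET ", b"POST ", b"PUT ", b"HEAD ", b"DELETE ", b"OPTIONS ", b"PATCH ")
    if data.startswith(methods):
        return "http"
    return "unknown"
```

So if the payloads tcpdump prints classify as plain HTTP, the client never started a TLS handshake at all.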

Every time I experienced the symptom you are seeing, it's been that the client tried to connect to a TLS endpoint with plain HTTP, but it's worth verifying.

Verifying the HTTPS endpoint

You could also try using the openssl command to check the HTTPS endpoint:

openssl s_client -connect your.ollama.endpoint.com:443

That will try a TLS negotiation and let you know if it worked. I don't think this is it, since you were able to connect with curl, but you never know.

rjmacarthy commented 4 months ago

Stale