Closed Marcinj21 closed 4 months ago
Hey, can you call your API over HTTPS without the Twinny extension?
Yes, calling it over HTTPS via `curl https://..` works correctly and returns an answer for the prompt.
Please check developer tools in vscode when twinny runs and paste the log, thanks.
Logs from twinny.json in dev tools:
Hello I notice you're trying to use api.openai.com
as your hostname which isn't currently supported.
Many thanks,
'api.openai.com' is only my DNS server name; we have a local server with GPUs hosting Ollama in our LAN, with NGINX configured for HTTPS/SSL (server name api.openai.com). Over HTTP on port 80, Twinny works.
So, do I have to change the server name?
Hey, sorry I am not actually sure in this case... @rcgtnick any advice, you seem to be the https/server expert! Many thanks.
@Marcinj21 Do you have access to your NGINX logs? If so, what is it logging when Twinny makes a request?
@rcgtnick
Logs from NGINX debug when using Twinny on port 443: twinny-nginx-debug.log
I also uploaded logs from another VS Code extension, Continue.dev. Continue connects to the same server as Twinny over HTTPS, and it works: continue-nginxdebug-logs.txt
This is my full NGINX configuration: nginx-config.txt
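For comparison, a minimal NGINX TLS reverse-proxy block in front of Ollama typically looks like the sketch below. The certificate paths, upstream port, and `server_name` are assumptions for illustration, not taken from the attached nginx-config.txt:

```nginx
# Hypothetical minimal TLS reverse proxy for Ollama; paths and names are
# placeholders, not the poster's actual configuration.
server {
    listen 443 ssl;
    server_name api.openai.com;

    ssl_certificate     /etc/nginx/ssl/server.crt;
    ssl_certificate_key /etc/nginx/ssl/server.key;

    location / {
        proxy_pass http://127.0.0.1:11434;  # Ollama's default port
        proxy_set_header Host $host;
    }
}
```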
Based on `peer closed connection in SSL handshake while SSL handshaking` from your NGINX log, I'd say the client (the Node.js HTTP/HTTPS library) wasn't happy with the TLS negotiation. My next steps would be:
`tcpdump` on the server or client? On the client, you could run this (if macOS):
`sudo tcpdump -nni en0 port 443 and host your.ollama.endpoint.com -s0 -A`
or this (if linux):
`sudo tcpdump -nni any port 443 and host your.ollama.endpoint.com -s0 -A`
Or, on the server, you could run this:
`sudo tcpdump -nni any port 443 and host 1.2.3.4 -s0 -A`
(where 1.2.3.4 is your IP)
Those will all show more or less the same data, so any of them work for this test.
The `-A` will cause tcpdump to print packet contents, so you can look for plain HTTP request headers. If you see anything you can actually read (like `Authorization:` or `POST /...`), the traffic is not encrypted and the issue is on the client.
Every time I experienced the symptom you are seeing, it's been that the client tried to connect to a TLS endpoint with plain HTTP, but it's worth verifying.
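To illustrate what you'd be looking for in the tcpdump output, here's a small Python sketch of the distinction: a TLS connection opens with a binary ClientHello record, while plain HTTP opens with readable ASCII. (The helper name and sample payloads are mine, just for illustration.)

```python
# First-byte heuristic: a TLS record starts with content-type 0x16
# (handshake, i.e. the ClientHello) followed by a 0x03 version byte,
# while plain HTTP starts with readable ASCII like "GET" or "POST".
def looks_like_tls(first_bytes: bytes) -> bool:
    return len(first_bytes) >= 2 and first_bytes[0] == 0x16 and first_bytes[1] == 0x03

print(looks_like_tls(b"\x16\x03\x01\x02\x00"))         # TLS ClientHello -> True
print(looks_like_tls(b"POST /api/generate HTTP/1.1"))  # plain HTTP -> False
```

If tcpdump shows the readable-ASCII case against port 443, the client sent plain HTTP to a TLS endpoint.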
You could also try using the `openssl` command to check the HTTPS endpoint:
`openssl s_client -connect your.ollama.endpoint.com:443`
That will try a TLS negotiation and let you know if it worked. I don't think this is it, since you were able to connect with `curl`, but you never know.
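The same handshake check can be scripted. Here's a rough Python analogue of `openssl s_client` using the standard library (the function name is mine, and the endpoint is a placeholder you'd replace with your own):

```python
import socket
import ssl

def tls_handshake_ok(host: str, port: int = 443, timeout: float = 5.0) -> bool:
    """Attempt a verified TLS handshake; True on success, False on any
    connection or TLS failure (roughly what openssl s_client reports)."""
    ctx = ssl.create_default_context()
    try:
        with socket.create_connection((host, port), timeout=timeout) as sock:
            with ctx.wrap_socket(sock, server_hostname=host):
                return True
    except (OSError, ssl.SSLError):
        return False

# Example (replace with your endpoint):
# print(tls_handshake_ok("your.ollama.endpoint.com"))
```

Note that this verifies the certificate chain against the system trust store, so a self-signed certificate on the NGINX side would also make it return False.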
Hello, I have a problem configuring Twinny to use HTTPS/TLS (I have an Ollama server behind NGINX). Over HTTP (port 80) Twinny works.
This is my json with twinny config:
Nginx configuration:
This is the error I get when I try to run a prompt: