BoltAI / BoltAI

BoltAI issue tracker
https://boltai.app

BoltAI not respecting port of "OpenAI-compatible Server (beta)" configuration #194

Closed maxaudet closed 3 weeks ago

maxaudet commented 1 month ago

Describe the bug The specified port is not being used when attempting to reach the API for an "OpenAI-compatible" configuration.

To Reproduce Steps to reproduce the behavior:

  1. Create a new "OpenAI-compatible Server"
  2. Configure it to use LiteLLM or Ollama, for example
  3. Specify a port other than 80 in the API endpoint
  4. Click "Verify and Save", it will produce an error
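The expected behavior in step 3 can be sketched with standard URL parsing: a client that honors the configured endpoint should connect to the explicit port, falling back to the scheme default only when none is given. (The endpoint below is a hypothetical example; 11434 is Ollama's default port, and this is a sketch of correct client behavior, not BoltAI's actual code.)

```python
from urllib.parse import urlsplit

# Hypothetical endpoint as configured in step 3; Ollama's
# OpenAI-compatible API listens on port 11434 by default.
endpoint = "http://localhost:11434/v1/chat/completions"

parts = urlsplit(endpoint)

# Use the explicit port when present; only fall back to the
# scheme default (80/443) when the URL omits it. The bug
# reported here is equivalent to always taking the fallback.
default_ports = {"http": 80, "https": 443}
port = parts.port if parts.port is not None else default_ports[parts.scheme]

print(parts.hostname, port)  # localhost 11434
```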

Expected behavior BoltAI should use the specified port when reaching the API.

Screenshots BoltAI config: bolt_error

ProxyMan showing the same behaviour: proxyman_bolt

Compared to the same call in Postman, which correctly uses the specified port: proxyman_postman

Version (please complete the following information):

Additional context This seems new as it used to work, but I don't know in which version it changed.

longseespace commented 1 month ago

Thanks. I will fix this soon.

longseespace commented 3 weeks ago

This has been fixed in v1.26.1. It will be live on Setapp soon (waiting for approval).