Issue with Preflight Over Redirect in LLM Servers
Description
The issue occurs with most hosted LLM servers (Groq, Fireworks, Azure OpenAI, etc.) because CORS preflight requests are not allowed to follow redirects. This leads to compatibility issues when connecting to these servers.
Recommendation
To address this, I recommend updating the documentation and the code to support the protocol (either http or https) directly in the server address. Additionally, it would be beneficial to remove the port specification, as it seems redundant and contributes to compatibility issues.
Reference
For reference, the relevant code section can be found here.
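To illustrate the recommended change, here is a minimal sketch of how the server address could be handled. The helper name `build_base_url` is hypothetical, not from the existing code: if the address already carries a scheme it is used as-is with no port appended, so hosted providers serving on the default HTTPS port work without triggering a redirect; a bare host falls back to an assumed default.

```python
def build_base_url(address: str) -> str:
    """Hypothetical helper: accept a server address that may already
    include a scheme (http:// or https://) and use it unchanged,
    instead of forcing a scheme and port onto a bare host."""
    if address.startswith(("http://", "https://")):
        # Respect the caller's protocol and do not append a port, so
        # hosted providers (Groq, Fireworks, Azure OpenAI, ...) that
        # serve on the default 443 are reached without a redirect.
        return address.rstrip("/")
    # Bare host: assume https, the common case for hosted endpoints.
    # A local server can still include its own port in the address.
    return f"https://{address.rstrip('/')}"

print(build_base_url("https://api.groq.com/openai/v1"))
print(build_base_url("localhost:8080"))
```

This keeps existing bare-host configurations working while letting users point directly at an HTTPS endpoint, which is the compatibility gap described above.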
I am more than happy to send over a pull request if you're interested.