n8n-io / n8n

Free and source-available fair-code licensed workflow automation tool. Easily automate tasks across different services.
https://n8n.io

Custom OpenAI Endpoint does not work with non OpenAI models #9862

Open JamesClarke7283 opened 5 days ago

JamesClarke7283 commented 5 days ago

Bug Description

When you deploy an OpenAI chat model with a custom base API URL, the model list only shows OpenAI models, not the ones actually available at that endpoint.

This is seen in version 1.47.0.

To Reproduce

  1. Create an AI Agent or Basic LLM Chain, or anything else that needs a chat model.
  2. Add an OpenAI Chat Model to it and specify a different base URL, such as "https://openrouter.ai/api/v1/", which is what I tested it on.
  3. Try to look at the listed models; it says "The value "gpt-3.5-turbo" is not supported!"

Expected behavior

It should pull the list of models from the endpoint, whether or not it is the official OpenAI endpoint.

Operating System

Arch Linux

n8n Version

1.47.0

Node.js Version

v22.3.0

Database

SQLite (default)

Execution mode

main (default)

Joffcom commented 5 days ago

Hey @JamesClarke7283,

It looks like the node does try to fetch the models, but we filter them to keep only names that start with gpt- or include instruct. We should probably update the node to remove the filter when a different base URL is used, which would solve the problem.
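A minimal sketch of the behavior described above, with the proposed fix of skipping the filter when a custom base URL is set. The names (`filterModels`, `isCustomBaseUrl`, the `Model` shape) are illustrative assumptions, not n8n's actual internals:

```typescript
// Hypothetical sketch of the model-list filter described above.
// filterModels and isCustomBaseUrl are illustrative names, not n8n code.
interface Model {
  id: string;
}

function filterModels(models: Model[], isCustomBaseUrl: boolean): Model[] {
  // Proposed fix: when a custom base URL is configured, return the
  // endpoint's model list unfiltered.
  if (isCustomBaseUrl) return models;
  // Current behavior: only models whose id starts with "gpt-" or
  // contains "instruct" survive the filter, so models served by
  // providers like OpenRouter are dropped.
  return models.filter(
    (m) => m.id.startsWith('gpt-') || m.id.includes('instruct'),
  );
}

const models: Model[] = [
  { id: 'gpt-3.5-turbo' },
  { id: 'anthropic/claude-3-haiku' },
  { id: 'babbage-002-instruct' },
];

console.log(filterModels(models, false).map((m) => m.id));
// → [ 'gpt-3.5-turbo', 'babbage-002-instruct' ]
console.log(filterModels(models, true).map((m) => m.id).length);
// → 3 (custom base URL: nothing filtered out)
```

With the default base URL the filter still behaves as today; the change only affects deployments pointing at a non-OpenAI endpoint.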

As a workaround, you can change the model field to use an expression and then type in the name of the model you want to use.

We are tracking this internally as AI-204.