langgenius / dify

Dify is an open-source LLM app development platform. Dify's intuitive interface combines AI workflow, RAG pipeline, agent capabilities, model management, observability features and more, letting you quickly go from prototype to production.
https://dify.ai

Multiple OpenAI Token Support #8832

Closed — racerxdl closed this issue 2 days ago

racerxdl commented 2 days ago

Self Checks

Dify version

0.8.0

Cloud or Self Hosted

Self Hosted (Docker)

Steps to reproduce

When adding a model, you can select an OpenAI token for it: [screenshot]

But there is also a "global" API token under OpenAI Setup: [screenshot]

✔️ Expected Behavior

Both tokens work, and each model uses its selected API token.

Custom models can be on different accounts, and since the Add Model screen requires a token, that token should be used, not the "global" one.

❌ Actual Behavior

Only the "global" API token is used.
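The expected-versus-actual behavior can be sketched as a credential lookup with the wrong precedence. The following is a hypothetical illustration, not Dify's actual code; the dict layout and the `openai_api_key` field name are assumptions:

```python
# Hypothetical sketch of the token precedence the reporter expects;
# the credential dicts and key names are placeholders, not Dify's schema.
def resolve_api_key(model_credentials: dict, provider_credentials: dict) -> str:
    """Prefer the token selected on the Add Model screen; fall back to the
    "global" token from OpenAI Setup only when no model token is set."""
    key = model_credentials.get("openai_api_key") or provider_credentials.get("openai_api_key")
    if not key:
        raise ValueError("no API key configured at either level")
    return key
```

The reported bug behaves as if the lookup always returned the provider-level ("global") key, ignoring any model-level entry.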

dosubot[bot] commented 2 days ago

I found a similar closed issue that might be relevant: "Would you consider supporting multiple Azure OpenAI keys to increase channel capacity?" [1].

To configure multiple OpenAI tokens in Dify version 0.8.0 and ensure that each custom model uses its selected API token instead of the global API token, follow these steps:

  1. Navigate to Settings > Model Providers.
  2. Add and configure the necessary model providers by entering the specific API keys for each provider.

This setup allows Dify to use the specified API key for each custom model, ensuring that the correct token is used instead of a global API token [2][3].
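Conceptually, the steps above amount to keeping one credential set per provider rather than a single shared key. A minimal sketch, assuming a simple dict-based store (the provider names and field names here are illustrative, not Dify's schema):

```python
# Illustrative per-provider credential store; keys are placeholders.
provider_credentials = {
    "openai": {"api_key": "sk-account-a"},
    "azure_openai": {"api_key": "key-account-b"},
}

def key_for(provider: str) -> str:
    """Look up the API key configured for one specific provider."""
    creds = provider_credentials.get(provider)
    if creds is None:
        raise KeyError(f"no credentials configured for provider {provider!r}")
    return creds["api_key"]
```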

If you encounter issues where the global API token is being used instead of the model-specific token, you can use the last_used_at field in the ApiToken class to debug. This field records the last time the token was used, which can help identify which token was used and when [4].

Additionally, there are known limitations regarding token validation. If you encounter an "Invalid token" error, try clearing the browser cache or regenerating a new App URL [5].

By following these steps and using the debugging information, you should be able to ensure that each custom model uses its selected API token.

To continue talking to Dosu, mention @dosu.

crazywoola commented 2 days ago

This is a paid feature, which can be accessed in the cloud version. See the docs here: https://docs.dify.ai/guides/model-configuration/load-balancing.

Also, please use the right template for feature requests; this template is for bugs only.

racerxdl commented 2 days ago

Well, that's a bug, since the UI actually allows you to enter multiple tokens.

rlaneth commented 1 day ago

@crazywoola, it’s quite astonishing that this issue has been closed without proper consideration. After reviewing the linked documentation, it's clear this is a bug and not something that should’ve been dismissed. I’ve been able to reproduce the issue, so let me break it down for you:

  1. @racerxdl never mentioned the Load Balancing feature, which, as per the documentation you have referred to, is indeed a paid feature in the cloud version. What he actually referred to was the ability to use a custom access token for a specific model, which is clearly available in the self-hosted version (as per @racerxdl's screenshots) but is not functioning correctly.

  2. @racerxdl was right to classify this as a bug and not a feature request. It’s not about adding a feature; it’s about fixing one that’s broken.

Maybe we should tag @goocarlos to come take a look since your reading comprehension seems a bit… compromised, @crazywoola. Or hey, here’s an idea: use an LLM to help you interpret these issues if that’s too hard! If this somehow doesn’t get confirmed as a bug, I’ll just fork the project and fix your broken code myself.