Closed potofo closed 1 month ago
To resolve the issue of registering the Azure OpenAI text-embedding-3-large model under a corporate proxy, given the error code 400 indicating that the embeddings operation does not work with the specified model, you need to ensure that the base_model_name specified in the credentials is valid and supported for embedding operations.
From the provided code, the validate_credentials method checks whether the base_model_name is valid:
def validate_credentials(self, model: str, credentials: dict) -> None:
    if 'openai_api_base' not in credentials:
        raise CredentialsValidateFailedError('Azure OpenAI API Base Endpoint is required')
    if 'openai_api_key' not in credentials:
        raise CredentialsValidateFailedError('Azure OpenAI API key is required')
    if 'base_model_name' not in credentials:
        raise CredentialsValidateFailedError('Base Model Name is required')
    if not self._get_ai_model_entity(credentials['base_model_name'], model):
        raise CredentialsValidateFailedError(f'Base Model Name {credentials["base_model_name"]} is invalid')
    try:
        credentials_kwargs = self._to_credential_kwargs(credentials)
        client = AzureOpenAI(**credentials_kwargs)
        self._embedding_invoke(
            model=model,
            client=client,
            texts=['ping'],
            extra_model_kwargs={}
        )
    except Exception as ex:
        raise CredentialsValidateFailedError(str(ex))
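For reference, a hedged sketch of the credentials dict this method expects; the key names come from the checks above, and the values below are placeholders, not real endpoints or keys:

```python
# Placeholder credentials dict matching the keys validate_credentials checks.
# All values are illustrative; substitute your own Azure OpenAI resource details.
credentials = {
    "openai_api_base": "https://<your-resource>.openai.azure.com",  # placeholder
    "openai_api_key": "<your-api-key>",                             # placeholder
    "base_model_name": "text-embedding-3-large",  # must be an embedding base model
}
```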
The _get_ai_model_entity method is used to verify that the base_model_name is valid:
@staticmethod
def _get_ai_model_entity(base_model_name: str, model: str) -> Optional[AzureBaseModel]:
    for ai_model_entity in EMBEDDING_BASE_MODELS:
        if ai_model_entity.base_model_name == base_model_name:
            ai_model_entity_copy = copy.deepcopy(ai_model_entity)
            ai_model_entity_copy.entity.model = model
            ai_model_entity_copy.entity.label.en_US = model
            ai_model_entity_copy.entity.label.zh_Hans = model
            return ai_model_entity_copy
    return None
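To make the lookup pattern concrete, here is a self-contained toy analogue of _get_ai_model_entity. The class and model names are simplified stand-ins, not Dify's real classes or its actual EMBEDDING_BASE_MODELS contents; the point is the match-by-base_model_name plus deep-copy behavior, which keeps the shared template list unmutated:

```python
import copy
from dataclasses import dataclass
from typing import Optional

@dataclass
class Entity:
    model: str

@dataclass
class BaseModel:
    base_model_name: str
    entity: Entity

# Illustrative entries only; the real list lives in Dify's source.
EMBEDDING_BASE_MODELS = [
    BaseModel("text-embedding-3-large", Entity(model="")),
    BaseModel("text-embedding-ada-002", Entity(model="")),
]

def get_entity(base_model_name: str, model: str) -> Optional[BaseModel]:
    for candidate in EMBEDDING_BASE_MODELS:
        if candidate.base_model_name == base_model_name:
            found = copy.deepcopy(candidate)  # copy, so the template stays pristine
            found.entity.model = model        # bind the user's deployment name
            return found
    return None  # invalid base_model_name -> CredentialsValidateFailedError upstream

hit = get_entity("text-embedding-3-large", "my-deployment")   # matches
miss = get_entity("gpt-4", "my-deployment")                   # not an embedding base model
```

Because a chat model like gpt-4 is not in the embedding list, passing it as base_model_name fails validation before any network call is made.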
Ensure that the base_model_name you are using is listed in EMBEDDING_BASE_MODELS; the valid base_model_name values for Azure OpenAI text embedding models are the entries defined in that list.
If the base_model_name is valid and you are still encountering issues, you may need to check the proxy settings and ensure that the network configuration is correctly set up for the Docker environment. Dify handles network configuration and proxy settings for self-hosted Docker environments through the ssrf_proxy service and environment variables for proxy settings [3][4][5].
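As a rough sketch of what those environment variables look like in a docker-compose override, assuming standard proxy variable conventions (the service names and exact keys may differ across Dify versions, and the proxy host below is a placeholder):

```yaml
# docker-compose override sketch -- service names and proxy host are illustrative.
services:
  api:
    environment:
      HTTP_PROXY: http://proxy.example.corp:8080    # placeholder corporate proxy
      HTTPS_PROXY: http://proxy.example.corp:8080   # placeholder corporate proxy
      NO_PROXY: localhost,127.0.0.1,ssrf_proxy      # keep internal traffic off the proxy
```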
Dify version
0.6.11
Cloud or Self Hosted
Self Hosted (Docker)
Steps to reproduce
Unable to register the Azure OpenAI text-embedding-3-large model under a corporate proxy. Does anyone know how to isolate and solve the issue?
1. Add corporate proxy to Docker Daemon
2. Add corporate proxy to Docker Client
3. Add the text embedding model provider: [Dify]->[Settings]->[Model Provider]->[Azure OpenAI Service]->[Add model]. The Azure OpenAI embedding model was added as follows:
The Deployment Name, API Endpoint URL, API Key, API Version, and Base Model are all values proven to work without a corporate proxy.
However, adding the text-embedding-3-large model fails with the following error.
Error code: 400 - {'error': {'code': 'Operation Not Supported', 'message': 'The embeddings operation does not work with the specified model, gpt-4. Please choose different model and try again. You can learn more about which models can be used with each operation here: https://go.microsoft.com/fwlink/?linkid=2197993.'}}
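Note that the error message names gpt-4, which suggests the deployment name you entered resolves to a chat model rather than an embedding model on the Azure side. One way to isolate Dify and the proxy from the Azure deployment itself is to issue the embeddings REST call by hand with the standard library; urllib honors the HTTP_PROXY/HTTPS_PROXY environment variables, so it exercises the same proxy path. This is a hedged sketch: the endpoint, deployment name, API version, and key below are placeholders.

```python
import json
import urllib.request

def build_embeddings_request(endpoint: str, deployment: str, api_version: str,
                             api_key: str, texts: list) -> urllib.request.Request:
    """Build the Azure OpenAI embeddings REST request by hand (no SDK needed)."""
    url = (f"{endpoint.rstrip('/')}/openai/deployments/{deployment}"
           f"/embeddings?api-version={api_version}")
    body = json.dumps({"input": texts}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={"api-key": api_key, "Content-Type": "application/json"},
        method="POST",
    )

# Placeholders below are illustrative, not real values.
req = build_embeddings_request(
    "https://<your-resource>.openai.azure.com",
    "<your-deployment-name>",  # the *deployment* name, not the base model name
    "2023-05-15",
    "<your-api-key>",
    ["ping"],
)
# with urllib.request.urlopen(req) as resp:   # uncomment to actually send
#     print(json.loads(resp.read())["data"][0]["embedding"][:3])
```

If this direct call reproduces the same 400 "Operation Not Supported" error, the deployment behind that name is not an embedding model, and the fix is on the Azure portal side (deploy text-embedding-3-large and point Dify's Deployment Name at it), not in Dify or the proxy configuration.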