langgenius / dify

Dify is an open-source LLM app development platform. Dify's intuitive interface combines AI workflow, RAG pipeline, agent capabilities, model management, observability features and more, letting you quickly go from prototype to production.
https://dify.ai

Invoking AWS Bedrock Mistral models has issues #6323

Closed. SIN1109 closed this issue 1 month ago.

SIN1109 commented 1 month ago

Self Checks

Dify version

0.6.13

Cloud or Self Hosted

Self Hosted (Docker)

Steps to reproduce

Create a chatbot agent in Studio without any PE, then type "hello" in the chat. It responds with the following error:

[bedrock] Error: An error occurred (validationException) when calling the InvokeModelWithResponseStream operation: Validation Error

The error occurs with both of these models:

    • mistral.mistral-7b-instruct-v0:2
    • mistral.mixtral-8x7b-instruct-v0:1

Claude 3 Sonnet and Titan (amazon.titan-text-express-v1) were also tested and work without issues.

The AK/SK and region have also been validated.

✔️ Expected Behavior

Mistral models can be used in Dify via the Bedrock integration.

❌ Actual Behavior

[Screenshot of the error message shown above]

dosubot[bot] commented 1 month ago

@SIN1109 Hello there! I'm here to help you with any bugs, questions, or contributions you have for the repository. Let's tackle this issue together!

To resolve the validationException error when invoking the AWS Bedrock Mistral model using the Dify platform, ensure the following:

  1. Correct Credentials: Verify that the AWS credentials (Access Key, Secret Access Key, and Region) are correct and have the necessary permissions.

  2. Model Parameters: Ensure that the parameters for the Mistral model are correctly set; a rough sketch of how these map onto the Bedrock request body follows this list. The required parameters for the Mistral model are:

    • temperature: Optional, default is 0.5
    • top_p: Optional, default is 0.9
    • top_k: Optional, default is 50, with a maximum of 200
    • max_tokens: Required, default is 512, with a minimum of 1 and a maximum of 8192
    • context_size: 32000
  3. Configuration File: Ensure that the bedrock.yaml configuration file is correctly set up with the necessary environment variables and model parameters.

  4. Validation Method: The validate_provider_credentials method in the MistralAIProvider class is responsible for validating the credentials. Ensure that this method is correctly implemented and that the credentials are valid.
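
As a rough illustration (not taken from the Dify source), the defaults listed in item 2 correspond to the JSON request body that Bedrock expects for mistral.* models. The [INST] wrapping of the chat turn is an assumption about what the platform should produce; missing or malformed fields here are a typical cause of a validationException:

# Illustrative sketch only: the inference parameters from item 2, expressed as
# the request body Bedrock expects for mistral.* models.
mistral_body = {
    "prompt": "<s>[INST] hello [/INST]",  # user turn wrapped in Mistral instruction tags
    "max_tokens": 512,    # required, 1 to 8192
    "temperature": 0.5,   # optional
    "top_p": 0.9,         # optional
    "top_k": 50,          # optional, maximum 200
}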

Here is a relevant code snippet for validating the credentials:

import logging

# The imports below reflect where these symbols live in the Dify code base;
# exact module paths may differ between versions.
from core.model_runtime.entities.model_entities import ModelType
from core.model_runtime.errors.validate import CredentialsValidateFailedError
from core.model_runtime.model_providers.__base.model_provider import ModelProvider

logger = logging.getLogger(__name__)


class MistralAIProvider(ModelProvider):

    def validate_provider_credentials(self, credentials: dict) -> None:
        # Validate the configured credentials by exercising a lightweight LLM call.
        try:
            model_instance = self.get_model_instance(ModelType.LLM)
            model_instance.validate_credentials(
                model='open-mistral-7b',
                credentials=credentials
            )
        except CredentialsValidateFailedError as ex:
            raise ex
        except Exception as ex:
            logger.exception(f'{self.get_provider_schema().provider} credentials validate failed')
            raise ex

Additionally, ensure that the InvokeModelWithResponseStream operation is correctly handled in the Dify platform. The payload for the Mistral model should be correctly formatted and passed to the AWS Bedrock runtime client.
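
For reference, here is a minimal boto3 sketch, assuming valid AWS credentials and a placeholder region (the model ID is taken from the issue), that reproduces the InvokeModelWithResponseStream call outside Dify and shows whether the payload itself triggers the validationException:

import json

import boto3  # assumes AWS credentials are available in the environment

# Region is a placeholder; use the region that was validated in the report.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

body = {
    "prompt": "<s>[INST] hello [/INST]",  # Mistral instruction format
    "max_tokens": 512,                    # required by the Bedrock Mistral schema
}

response = client.invoke_model_with_response_stream(
    modelId="mistral.mistral-7b-instruct-v0:2",
    contentType="application/json",
    accept="application/json",
    body=json.dumps(body),
)

# Each streamed chunk for Mistral models carries an "outputs" list with text.
for event in response["body"]:
    chunk = json.loads(event["chunk"]["bytes"])
    print(chunk["outputs"][0]["text"], end="", flush=True)

If this standalone call succeeds, the problem lies in how Dify builds the payload rather than in the credentials or the model itself.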

If the credentials and parameters are correct and you are still encountering the error, you may need to debug further by logging the exact exception message or checking the implementation of the validate_credentials method in the model instance [1][2][3][4].

To continue talking to Dosu, mention @dosu.

longzhihun commented 1 month ago

The issue was fixed in the main branch; please have a look.

crazywoola commented 1 month ago

Link: https://github.com/langgenius/dify/pull/6299