langgenius / dify

Dify is an open-source LLM app development platform. Dify's intuitive interface combines AI workflow, RAG pipeline, agent capabilities, model management, observability features and more, letting you quickly go from prototype to production.
https://dify.ai

LocalAI configuration embedded model is invalid #2814

Closed. wangrg closed this issue 8 months ago.

wangrg commented 8 months ago

Dify version

0.5.9

Cloud or Self Hosted

Self Hosted (Source)

Steps to reproduce

The root cause: the get_customizable_model_schema function is not implemented for the LocalAI provider.

✔️ Expected Behavior

Implement the get_customizable_model_schema function

❌ Actual Behavior

LocalAI configuration embedded model is invalid

dosubot[bot] commented 8 months ago

Hey @wangrg, good to see you diving into the depths again! How's everything going on your end?

Based on your request, it seems you're looking to implement the get_customizable_model_schema function to resolve the issue of the LocalAI configuration embedded model being invalid in the self-hosted version 0.5.9 of Dify. Here's a step-by-step guide on how you can achieve this:

  1. Understand the Configuration Schema: The configuration schema includes model name, completion type (with options for completion and chat completion), and server URL. These are crucial for interacting with the LocalAI model.

  2. Define the Function: The get_customizable_model_schema function should dynamically generate a schema based on the supported model types and configuration methods defined in the YAML file. This function will be used to validate and guide the user through setting up a model in Dify.

  3. Implementing the Function:

    • The function should read the supported_model_types and configurate_methods from the YAML file to ensure that it only offers options that are compatible with the LocalAI provider.
    • It should then dynamically construct a JSON schema (or a similar structure) that reflects the model_credential_schema defined in the YAML. This includes handling different types of inputs (e.g., select, text-input) and conditions (e.g., showing completion type based on the model type selected).
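To make step 3 concrete, here is a hypothetical fragment illustrating the shape of a model_credential_schema section that such a function would consume. The field names follow those referenced above (variable, type, label, options); the exact contents of Dify's localai.yaml may differ:

```yaml
model_credential_schema:
  model:
    label:
      en_US: Model Name
    placeholder:
      en_US: Enter your model name
  credential_form_schemas:
    - variable: completion_type
      label:
        en_US: Completion Type
      type: select
      required: true
      options:
        - value: completion
          label:
            en_US: Completion
        - value: chat_completion
          label:
            en_US: Chat Completion
    - variable: server_url
      label:
        en_US: Server URL
      type: text-input
      required: true
      placeholder:
        en_US: Enter the URL of your LocalAI server
```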

Here's a sample code snippet that might help:

import yaml

def get_customizable_model_schema(yaml_path="api/core/model_runtime/model_providers/localai/localai.yaml"):
    with open(yaml_path, 'r') as file:
        localai_config = yaml.safe_load(file)

    # Assuming the structure of the YAML file is consistent and well-defined
    model_credential_schema = localai_config.get('model_credential_schema', {})

    # Construct the schema based on the YAML configuration
    schema = {
        "type": "object",
        "properties": {
            "model": {
                "type": "string",
                "title": model_credential_schema['model']['label']['en_US'],
                "placeholder": model_credential_schema['model']['placeholder']['en_US']
            }
        },
        "required": ["model"]
    }

    # Dynamically add other properties based on the credential form schemas
    for item in model_credential_schema.get('credential_form_schemas', []):
        prop = {
            item['variable']: {
                "type": "string" if item['type'] == 'text-input' else "select",
                "title": item['label']['en_US'],
                "placeholder": item.get('placeholder', {}).get('en_US', ''),
                "required": item.get('required', False)
            }
        }
        if item['type'] == 'select':
            prop[item['variable']]['options'] = [{option['value']: option['label']['en_US']} for option in item['options']]

        schema["properties"].update(prop)
        if item.get('required', False):
            schema["required"].append(item['variable'])

    return schema

# This function would then be used to generate a schema for UI rendering or validation
custom_schema = get_customizable_model_schema()
print(custom_schema)

This code snippet opens and reads the localai.yaml file, then constructs a schema based on the model_credential_schema section. It dynamically handles different input types and requirements. This approach allows for flexibility and adaptability to changes in the YAML configuration without needing to hard-code the schema in the application logic.
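Once generated, a schema like this can drive a basic credential check before the provider configuration is saved. The validator below is a minimal, self-contained sketch (it is not Dify's actual validation code); the hand-written schema mirrors the shape produced by the snippet above, including the single-key option dicts for select fields.

```python
def validate_credentials(schema, credentials):
    """Check a credentials dict against a generated schema.

    Returns a list of error messages; an empty list means valid.
    """
    errors = []
    # Every required field must be present and non-empty
    for field in schema.get("required", []):
        if not credentials.get(field):
            errors.append(f"missing required field: {field}")
    for key, value in credentials.items():
        prop = schema["properties"].get(key)
        if prop is None:
            errors.append(f"unknown field: {key}")
        elif prop.get("options") is not None:
            # Options are single-key dicts mapping value -> display label
            allowed = {v for opt in prop["options"] for v in opt}
            if value not in allowed:
                errors.append(f"{key}: {value!r} not in {sorted(allowed)}")
    return errors


# A hand-written schema in the same shape the generator produces
schema = {
    "type": "object",
    "properties": {
        "model": {"type": "string", "title": "Model Name"},
        "completion_type": {
            "type": "select",
            "title": "Completion Type",
            "options": [{"completion": "Completion"},
                        {"chat_completion": "Chat Completion"}],
        },
        "server_url": {"type": "string", "title": "Server URL"},
    },
    "required": ["model", "server_url"],
}

ok = validate_credentials(schema, {"model": "all-MiniLM-L6-v2",
                                   "server_url": "http://localhost:8080"})
bad = validate_credentials(schema, {"completion_type": "stream"})
print(ok)   # no errors
print(bad)  # missing fields plus an invalid select value
```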

You can find the localai.yaml file in the Dify repository at api/core/model_runtime/model_providers/localai/localai.yaml.

Please let me know if you need further assistance or clarification.

#### About Dosu

This response is meant to be useful and save you time. It is **not meant to be a precise solution**, but rather a starting point for your own research.

Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.
wangrg commented 8 months ago

For reference, the file that needs the implementation: api/core/model_runtime/model_providers/localai/text_embedding/text_embedding.py

liu3xing3long commented 8 months ago

Still not working on dify-api:0.5.8, with LocalAI updated to the master branch.

[screenshots attached]

liu3xing3long commented 8 months ago

Tried two embedding models, jinaai/jina-embeddings-v2-base-zh and all-MiniLM-L6-v2; neither works through Dify, although both respond correctly via direct curl requests.
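Since the curl interface reportedly works, a sanity check independent of Dify can confirm the LocalAI side. The sketch below builds the same kind of OpenAI-compatible /v1/embeddings request; the server URL, port, and model name are placeholder assumptions to adjust for your deployment.

```python
import json
import urllib.request


def build_embedding_request(server_url: str, model: str, text: str) -> urllib.request.Request:
    """Build an OpenAI-style /v1/embeddings request for a LocalAI server.

    The URL, port, and model name used below are placeholders; adjust
    them to match your own LocalAI deployment.
    """
    payload = {"model": model, "input": text}
    return urllib.request.Request(
        f"{server_url.rstrip('/')}/v1/embeddings",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# Equivalent to the kind of curl call used to verify the server directly:
#   curl http://localhost:8080/v1/embeddings \
#     -H "Content-Type: application/json" \
#     -d '{"model": "all-MiniLM-L6-v2", "input": "hello world"}'
req = build_embedding_request("http://localhost:8080", "all-MiniLM-L6-v2", "hello world")
# urllib.request.urlopen(req) would send this to a running server
```

If this request succeeds outside Dify but the model still fails validation inside Dify, that points back at the provider-side schema issue rather than the LocalAI server.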

Yeuoly commented 8 months ago

Fixed in #2840.