langgenius / dify

Dify is an open-source LLM app development platform. Dify's intuitive interface combines AI workflow, RAG pipeline, agent capabilities, model management, observability features and more, letting you quickly go from prototype to production.
https://dify.ai

For model series that do not yet support Function Calling, we provide a universal ReAct method for invocation. #3918

Closed: myrulezzz closed this issue 3 months ago

myrulezzz commented 5 months ago

Self Checks

Provide a description of requested docs changes

How can agent reasoning be achieved with various LLMs? To make these AI agents smart, you need a solid base model with CoT (Chain of Thought) reasoning capabilities. Different model providers support different ways of making these models reason, such as Function Calling and ReAct, and those that use Function Calling usually work better. Right now, models from OpenAI, ChatGLM, Tongyi, MiniMax, and ERNIE Bot support Function Calling. For model series that do not yet support Function Calling, we provide a universal ReAct method for invocation.

Can you explain more about the universal ReAct method and its invocation? For example, with Llama 3 70B running on Ollama, is there any way to use tools and custom tools with it?

dosubot[bot] commented 5 months ago

The universal ReAct method you're asking about is designed to enhance agent capabilities within the Dify ecosystem, particularly for models that do not support Function Calling, such as Llama 3 70B served through Ollama. ReAct is part of a broader tooling framework in Dify, which also lets you invoke standard tools and load OpenAPI Specification APIs directly as tools. In this setup, ReAct can work with the more than 30 built-in tools available, providing a versatile method for model invocation and interaction.

For models such as LLama3 70b running on Ollama, Dify offers a customizable model configuration method. This method allows for specifying various parameters to tailor the model's behavior to specific applications, including the mode (completion or chat), context size, max tokens, and whether the model has vision support. This flexibility ensures that even models without native Function Calling support can be effectively utilized within the Dify framework.
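
For instance, before registering such a model in Dify's settings, you could first confirm that Ollama is actually serving it. A minimal sketch, assuming Ollama is running on its default port 11434 and the llama3:70b model has already been pulled (the /api/generate endpoint is Ollama's standard generation API):

import requests

# Ask the local Ollama server for a one-off, non-streaming completion.
# Assumptions: default Ollama port 11434, and llama3:70b already pulled.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "llama3:70b", "prompt": "Say hello.", "stream": False},
)
resp.raise_for_status()
print(resp.json()["response"])  # the model's generated text

If this round trip works, the same base URL can be entered when adding the model in Dify.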

While specific implementation details or examples of using the ReAct method with LLama3 70b were not provided, the information suggests that Dify's configuration options are designed to accommodate a wide range of model capabilities. This includes integrating Large Language Models (LLMs) and Text Embedding capabilities for conversational applications and text analysis, even for models deployed through services like Ollama.

In summary, while direct examples of ReAct with LLama3 70b are not detailed, the Dify ecosystem supports a flexible and configurable approach to model integration and invocation. This approach is likely to facilitate the use of ReAct with various models, including those not supporting Function Calling, by leveraging the extensive tooling and configuration options available within Dify.

#### Details

**Sources**
- [en/getting-started/readme/specifications-and-technical-features.md](https://github.com/langgenius/dify-docs/blob/main/en/getting-started/readme/specifications-and-technical-features.md)
- [api/core/model_runtime/model_providers/ollama/ollama.yaml](https://github.com/langgenius/dify/blob/main/api/core/model_runtime/model_providers/ollama/ollama.yaml)
- [en/tutorials/model-configuration/ollama.md](https://github.com/langgenius/dify-docs/blob/main/en/tutorials/model-configuration/ollama.md)
- [zh_CN/guides/model-configuration/ollama.md](https://github.com/langgenius/dify-docs/blob/main/zh_CN/guides/model-configuration/ollama.md)
- [en/tutorials/model-configuration/openllm.md](https://github.com/langgenius/dify-docs/blob/main/en/tutorials/model-configuration/openllm.md)
- [en/tutorials/model-configuration/README.md](https://github.com/langgenius/dify-docs/blob/main/en/tutorials/model-configuration/README.md)
- [en/getting-started/what-is-llmops.md](https://github.com/langgenius/dify-docs/blob/main/en/getting-started/what-is-llmops.md)
- [api/core/model_runtime/model_providers/nvidia/llm/llama3-70b.yaml](https://github.com/langgenius/dify/blob/main/api/core/model_runtime/model_providers/nvidia/llm/llama3-70b.yaml)

**About Dosu** This response is meant to be useful and save you time. It is **not meant to be a precise solution**, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.

To continue the conversation, mention @dosu.

feiwofeifeixiaowo commented 5 months ago

In Agent mode, Dify enables ReAct mode by default for models that do not support function calling, so that those models can still invoke tools.

(screenshot: https://github.com/langgenius/dify/assets/4606937/553bcca4-492c-475a-ba60-a059c1e4fd07)

As far as I know, it is currently not possible to edit the ReAct prompts in the web UI. If you need to, you can manually adjust the content in the following code: ReAct Prompt (https://github.com/langgenius/dify/blob/661b30784eadb618e43c04809efbd52b33399637/api/core/tools/prompt/template.py#L1)
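
For context, a ReAct prompt drives a loop of Thought, Action, and Observation steps. Here is a generic sketch of such a trace, illustrative only and not Dify's exact template, using the weather tool described below:

# A generic ReAct-style trace (illustrative only, not Dify's template).
react_trace = """
Question: What is the weather in Paris?
Thought: I should look up the current weather with the GetCurrentWeather tool.
Action: GetCurrentWeather
Action Input: {"location": "Paris"}
Observation: {"location": "Paris", "temperature": "24°C", "condition": "Sunny"}
Thought: I now know the final answer.
Final Answer: It is currently 24°C and sunny in Paris.
"""

The model emits the Thought and Action lines, the framework executes the tool and injects the Observation, and the loop repeats until a Final Answer is produced.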

You can customize tools from the agent configuration interface through "tools add" and "custom tool add". Below is the tool I use for testing, which looks up the weather for a city.

{
      "openapi": "3.1.0",
      "info": {
        "title": "一个输入城市名称查询天气的函数",
        "description": "一个输入城市名称查询天气的函数",
        "version": "v1.0.0"
      },
      "servers": [
        {
          "url": "http://host.docker.internal:8013"
        }
      ],
      "paths": {
        "/weather": {
          "get": {
            "description": "一个输入城市名称查询天气的函数",
            "operationId": "GetCurrentWeather",
            "parameters": [
              {
                "name": "location",
                "in": "query",
                "description": "一个你想查询的城市名称",
                "required": true,
                "schema": {
                  "type": "string"
                }
              }
            ],
            "deprecated": false
          }
        }
      },
      "components": {
        "schemas": {}
      }
    }
from flask import Flask, request, jsonify

app = Flask(__name__)

# Simulated weather data for example
weather_data = {
    "北京": {"temperature": "36°C", "condition": "Cloudy"},
    "巴黎": {"temperature": "24°C", "condition": "Sunny"},
    "Beijing": {"temperature": "36°C", "condition": "Cloudy"},
    "Paris": {"temperature": "24°C", "condition": "Sunny"},
    "Chicago, IL": {"temperature": "10°C", "condition": "Rainy"}
}

@app.route('/weather', methods=['GET'])
def get_current_weather():
    location = request.args.get('location')
    if location:
        # Fall back to an error string when the city is not in the mock data.
        weather_info = weather_data.get(location, "Weather data not available for the specified location.")
        if isinstance(weather_info, str):
            return jsonify({"error": weather_info}), 404
        return jsonify({
            "location": location,
            "temperature": weather_info['temperature'],
            "condition": weather_info['condition']
        })
    else:
        return jsonify({"error": "Location parameter is missing"}), 400

if __name__ == '__main__':
    # Listen on port 8013 to match the server URL in the OpenAPI schema above;
    # binding to 0.0.0.0 (an assumption) lets the Dockerized Dify reach the app.
    app.run(host='0.0.0.0', port=8013, debug=False)
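
Once the Flask app is running, you can reproduce the same GET request the Dify custom tool issues (assuming the app listens on port 8013, matching the server URL in the schema above):

import requests

# Call the /weather endpoint exactly as the custom tool would.
r = requests.get("http://localhost:8013/weather", params={"location": "Paris"})
print(r.status_code, r.json())
# Expected: 200 {'condition': 'Sunny', 'location': 'Paris', 'temperature': '24°C'}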

Custom Tools in Docs: https://docs.dify.ai/tutorials/quick-tool-integration

myrulezzz commented 5 months ago

OK, thanks. So far I have only used custom tools: I added a Swagger file, and ChatGPT was able to perform tasks based on the Swagger file's description. But if I understand correctly, the code above can be used to integrate APIs with other LLMs like Llama 3 70B, correct?


myrulezzz commented 5 months ago

Also, for custom tools, do you add the Flask app itself, or are you running it separately on the server in Docker?


feiwofeifeixiaowo commented 5 months ago

Sure. The example above is how I tested the Llama 3 8B Groq API, and it also applies to Llama 3 70B. In the example, the custom tool essentially makes an HTTP request to its own API. I deployed the Flask app separately on a MacBook to test the custom tool. My Dify instance is deployed locally with Docker, so the custom tool's request URL uses host.docker.internal (which resolves to the host machine on Docker Desktop; on Linux you may need to map it manually).