langchain4j / langchain4j

Java version of LangChain
https://docs.langchain4j.dev
Apache License 2.0
4.82k stars 955 forks

Local LLM supports LangChain4j function calls #1134

Closed leleZeng closed 5 months ago

leleZeng commented 5 months ago

I deployed Llama 3 locally using DJL. How can LangChain4j support function calls?

dosubot[bot] commented 5 months ago

To enable LangChain4j to support function calls after deploying LlaMa3 using DJL locally, you need to configure the tools that the language model can use. Here’s how you can do it:

  1. Define your tools by annotating methods with `@Tool`:

```java
import dev.langchain4j.agent.tool.Tool;

public class MyTools {

    @Tool(name = "exampleTool")
    public String exampleTool(String input) {
        // Your tool logic here
        return "Processed: " + input;
    }
}
```
  2. Configure the AI service to use these tools:

```java
import dev.langchain4j.model.chat.ChatLanguageModel;
import dev.langchain4j.service.AiServices;

public class MyAiService {

    public static void main(String[] args) {
        ChatLanguageModel model = ...; // Initialize your Llama 3 model using DJL

        MyTools tools = new MyTools();

        Assistant assistant = AiServices.builder(Assistant.class)
                .chatLanguageModel(model)
                .tools(tools)
                .build();

        String response = assistant.chat("Hello");
        System.out.println(response);
    }

    interface Assistant {
        String chat(String userMessage);
    }
}
```
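Under the hood, tool support of this kind amounts to scanning an object for annotated methods and dispatching calls to them by name. Here is a minimal plain-Java sketch of that idea; the `MyTool` annotation and the registry are illustrative stand-ins, not LangChain4j's actual internals:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Method;
import java.util.HashMap;
import java.util.Map;

public class ToolDispatchSketch {

    // Hypothetical stand-in for dev.langchain4j.agent.tool.Tool
    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.METHOD)
    @interface MyTool {
        String name();
    }

    static class MyTools {
        @MyTool(name = "exampleTool")
        public String exampleTool(String input) {
            return "Processed: " + input;
        }
    }

    // Scan annotated methods into a name -> method registry
    static Map<String, Method> registerTools(Object target) {
        Map<String, Method> registry = new HashMap<>();
        for (Method m : target.getClass().getMethods()) {
            MyTool tool = m.getAnnotation(MyTool.class);
            if (tool != null) {
                registry.put(tool.name(), m);
            }
        }
        return registry;
    }

    public static void main(String[] args) throws Exception {
        MyTools tools = new MyTools();
        Map<String, Method> registry = registerTools(tools);
        // Simulate the model requesting the tool by name
        Object result = registry.get("exampleTool").invoke(tools, "Hello");
        System.out.println(result); // Processed: Hello
    }
}
```

The real `AiServices` builder does considerably more (JSON argument parsing, tool schemas sent to the model), but the name-to-method registry is the core mechanism that `@Tool` enables.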

In this example, `AiServices` wires the `@Tool`-annotated methods of `MyTools` into the chat model, so the model can request tool executions during a conversation.

Additionally, LangChain4j provides a `FunctionCallHelper` class (in the Vertex AI Gemini module) to handle function calls and tool execution requests. It converts tool execution requests into provider-specific function calls and vice versa, ensuring that the language model can call the defined tools during execution.
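The request/execute/respond cycle that such helpers implement can be sketched in plain Java. Everything below (the `ToolExecutionRequest` record, `fakeModelTurn`) is a simplified stand-in for illustration, not the library's API:

```java
import java.util.Map;
import java.util.function.Function;

public class ToolLoopSketch {

    // Simplified stand-in for the library's tool execution request type
    record ToolExecutionRequest(String name, String arguments) {}

    // Fake "model" turn: always requests exampleTool; a real model decides
    // from the conversation whether (and which) tool to call
    static ToolExecutionRequest fakeModelTurn(String userMessage) {
        return new ToolExecutionRequest("exampleTool", userMessage);
    }

    // One request/execute cycle: the model asks for a tool by name,
    // the service looks it up, runs it, and returns the result
    static String runOneToolCycle(String userMessage,
                                  Map<String, Function<String, String>> tools) {
        ToolExecutionRequest request = fakeModelTurn(userMessage);
        return tools.get(request.name()).apply(request.arguments());
    }

    public static void main(String[] args) {
        Map<String, Function<String, String>> tools =
                Map.of("exampleTool", input -> "Processed: " + input);
        System.out.println(runOneToolCycle("Hello", tools)); // Processed: Hello
    }
}
```

In the real library, the tool result is appended to the conversation as a tool message and the model is called again, looping until it produces a final text answer instead of another tool request.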

#### Details

**Sources**

- [[FEATURE] llm work with tool using function-calling](https://github.com/langchain4j/langchain4j/issues/86)
- [langchain4j/src/main/java/dev/langchain4j/service/AiServices.java](https://github.com/langchain4j/langchain4j/blob/main/langchain4j/src/main/java/dev/langchain4j/service/AiServices.java)
- [langchain4j-vertex-ai-gemini/src/main/java/dev/langchain4j/model/vertexai/FunctionCallHelper.java](https://github.com/langchain4j/langchain4j/blob/main/langchain4j-vertex-ai-gemini/src/main/java/dev/langchain4j/model/vertexai/FunctionCallHelper.java)
- [langchain4j-bom/pom.xml](https://github.com/langchain4j/langchain4j/blob/main/langchain4j-bom/pom.xml)
- [langchain4j-anthropic/src/main/java/dev/langchain4j/model/anthropic/AnthropicStreamingChatModel.java](https://github.com/langchain4j/langchain4j/blob/main/langchain4j-anthropic/src/main/java/dev/langchain4j/model/anthropic/AnthropicStreamingChatModel.java)
**About Dosu**

This response is meant to be useful and save you time. It is **not meant to be a precise solution**, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.

To continue the conversation, mention @dosu.

langchain4j commented 5 months ago

Hi, you can use LocalAI. Here is an example of how you can create the model: https://github.com/langchain4j/langchain4j-examples/pull/82/files And here is an example of how to use tools: https://docs.langchain4j.dev/tutorials/ai-services/#tools-function-calling