quarkiverse / quarkus-langchain4j

Quarkus Langchain4j extension
https://docs.quarkiverse.io/quarkus-langchain4j/dev/index.html
Apache License 2.0

required to set chat-model provider when default should work #1055

Closed: maxandersen closed this issue 2 weeks ago

maxandersen commented 2 weeks ago

I have this:

///usr/bin/env jbang "$0" "$@" ; exit $?
//DEPS io.quarkus.platform:quarkus-bom:3.15.1@pom
//DEPS io.quarkus:quarkus-picocli
//DEPS io.quarkiverse.langchain4j:quarkus-langchain4j-openai:0.21.0.CR4
//DEPS io.quarkiverse.langchain4j:quarkus-langchain4j-ollama:0.21.0.CR4

//Q:CONFIG quarkus.banner.enabled=false
////Q:CONFIG quarkus.log.level=WARN
import static java.lang.System.out;

import java.util.List;

import dev.langchain4j.model.chat.ChatLanguageModel;
import io.quarkiverse.langchain4j.ModelName;
import jakarta.enterprise.inject.Any;
import jakarta.enterprise.inject.Instance;
import jakarta.inject.Inject;
import picocli.CommandLine.Command;

////Q:CONFIG quarkus.langchain4j.openai.openai.model-id=gpt-4o-mini
////Q:CONFIG quarkus.langchain4j.openai.openai.api-key=${OPENAI_API_KEY}

////Q:CONFIG quarkus.langchain4j.openai.github.base-url=https://models.inference.ai.azure.com
////Q:CONFIG quarkus.langchain4j.openai.github.api-key=${GITHUB_TOKEN}
////Q:CONFIG quarkus.langchain4j.openai.github.chat-model.model-id=Phi-3.5-MoE-instruct

////Q:CONFIG quarkus.langchain4j.openai.podmanai.base-url=http://localhost:51352/v1
////Q:CONFIG quarkus.langchain4j.openai.podmanai.api-key=sk-dummy

////Q:CONFIG quarkus.langchain4j.ollama.granite.api-key=x
//Q:CONFIG quarkus.langchain4j.ollama.granite.model-id=granite3-dense

//Q:CONFIG quarkus.langchain4j.temperature=1.0
//Q:CONFIG quarkus.langchain4j.timeout=10s
////Q:CONFIG quarkus.langchain4j.log-requests=true
////Q:CONFIG quarkus.langchain4j.log-responses=true

@Command
public class jokesmulti implements Runnable {

    @Inject @Any Instance<ChatLanguageModel> instance;
    /* 
    @Inject @ModelName("github")
    private ChatLanguageModel github;

    @Inject @ModelName("openai")
    private ChatLanguageModel openai;

    @Inject @ModelName("podmanai")
    private ChatLanguageModel podmanai;
*/
    @Inject @ModelName("granite")
    private ChatLanguageModel granite;

    @Override
    public void run() {
        List.of(/*"github", "openai", "podmanai", */"granite").forEach((modelName) -> {
            var model = instance.select(ModelName.Literal.of(modelName)).get();
            out.println(modelName+":");
            try {
                out.println(model.generate("tell me a joke"));
            } catch (Exception e) {
                out.println(e.getMessage());
            }
            out.println("--------------------------------");
        });

    }
}

which results in this build error:

Caused by: io.quarkus.runtime.configuration.ConfigurationException: A ChatLanguageModel or StreamingChatLanguageModel bean was requested, but since there are multiple available providers, the 'quarkus.langchain4j.granite.chat-model.provider' needs to be set to one of the available options (openai,ollama).

I'm a bit confused by that error, since what the user actually configured is:

quarkus.langchain4j.ollama.granite.model-id

which has ollama in the name, so shouldn't it be possible to infer the provider from that?
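For reference, the error message itself spells out the immediate workaround: pin the provider for the named model explicitly. A minimal sketch, assuming the rest of the script above stays unchanged, would be to add this line to the //Q:CONFIG block:

//Q:CONFIG quarkus.langchain4j.granite.chat-model.provider=ollama

With the provider pinned, the build no longer has to choose between the openai and ollama extensions on the classpath.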

geoand commented 2 weeks ago

I think we can do better, I'll have a look

geoand commented 2 weeks ago

With #1057 this will be addressed (assuming you use quarkus.langchain4j.ollama.granite.chat-model.model-id=granite3-dense instead of the incorrect property currently being used)
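In terms of the script above, the corrected //Q:CONFIG line would be (a sketch of the fix described in the comment, with nothing else in the script changed):

//Q:CONFIG quarkus.langchain4j.ollama.granite.chat-model.model-id=granite3-dense

Note the chat-model segment: model-id belongs under the named model's chat-model configuration, and with #1057 the ollama segment of the property name should then be enough for the provider to be inferred.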