adriens closed this issue 2 months ago
:star_struck: Indeed, it does :heavy_heart_exclamation:
```properties
quarkus.langchain4j.moderation-model.provider=openai
quarkus.langchain4j.openai.api-key=test1
quarkus.langchain4j.c1.chat-model.provider=openai
quarkus.langchain4j.c1.moderation-model.provider=openai
quarkus.langchain4j.c1.embedding-model.provider=azure-openai
quarkus.langchain4j.openai.c1.api-key=test2
quarkus.langchain4j.azure-openai.c1.resource-name=res
quarkus.langchain4j.azure-openai.c1.deployment-name=dep
quarkus.langchain4j.azure-openai.c1.api-key=test
quarkus.langchain4j.c2.chat-model.provider=azure-openai
quarkus.langchain4j.c2.embedding-model.provider=azure-openai
quarkus.langchain4j.azure-openai.c2.resource-name=res
quarkus.langchain4j.azure-openai.c2.deployment-name=dep
quarkus.langchain4j.azure-openai.c2.api-key=test3
quarkus.langchain4j.c3.chat-model.provider=huggingface
quarkus.langchain4j.huggingface.c3.api-key=test4
quarkus.langchain4j.c4.chat-model.provider=bam
quarkus.langchain4j.c4.moderation-model.provider=bam
quarkus.langchain4j.bam.c4.api-key=test5
quarkus.langchain4j.c5.chat-model.provider=ollama
quarkus.langchain4j.c6.chat-model.provider=openshift-ai
quarkus.langchain4j.openshift-ai.c6.base-url=https://somecluster.somedomain.ai:443/api
quarkus.langchain4j.openshift-ai.c6.chat-model.model-id=somemodel
quarkus.langchain4j.c7.chat-model.provider=watsonx
quarkus.langchain4j.watsonx.c7.base-url=https://somecluster.somedomain.ai:443/api
quarkus.langchain4j.watsonx.c7.api-key=test8
quarkus.langchain4j.watsonx.c7.project-id=proj
quarkus.langchain4j.e1.embedding-model.provider=openai
quarkus.langchain4j.openai.e1.api-key=test5
quarkus.langchain4j.e2.embedding-model.provider=ollama
quarkus.langchain4j.e3.embedding-model.provider=dev.langchain4j.model.embedding.AllMiniLmL6V2EmbeddingModel
```
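For context, here is a minimal sketch of how one of the named configurations above could be bound to an AI service, assuming the `modelName` attribute of `@RegisterAiService` in quarkus-langchain4j; the interface and method names are illustrative, not from the samples:

```java
package com.example;

import dev.langchain4j.service.UserMessage;
import io.quarkiverse.langchain4j.RegisterAiService;

// Illustrative sketch: binds this AI service to the named configuration "c5"
// (the ollama-backed chat model declared in the properties above).
// Services without a modelName use the default (unnamed) configuration.
@RegisterAiService(modelName = "c5")
public interface OllamaAssistant {

    @UserMessage("Summarize the following text: {text}")
    String summarize(String text);
}
```

A second interface annotated with `@RegisterAiService(modelName = "c1")` would target the openai-backed configuration instead, which is how a single app can mix ollama and openai.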
Thanks a lot for the help; it saved us a lot of time :pray: 🥳
:grey_question: About
We are currently building a single app that will have to rely on `ollama` for most tasks. For some tasks, such as function calling or structured outputs, we feel more confident with `openai`. In the samples you provide, we did not find how to specify, via configuration or annotation, which LLM should be targeted.
:grey_question: Question