marcushellberg / java-ai-playground

MIT License

I want to run this locally with my LLM, such as Ollama. #4

Open nevenc opened 8 months ago

nevenc commented 8 months ago

I want to be able to run this application locally with my LLM, such as Ollama.

The application currently requires openai.api.key in LangChain4jConfig. There's no way to skip creating the langchain4j-openai model or to provide langchain4j-ollama as an alternative.

In Spring AI it's easy: you just use a different starter and the app works without any code changes, e.g. spring-ai-ollama-spring-boot-starter instead of spring-ai-openai-spring-boot-starter.

marcushellberg commented 8 months ago

Fair point. The intent of the project was to show how the different libraries compare, so I haven't focused much on making it easy to change models. As you said, you'd need to make code changes in the LangChain4j example if you want to use Ollama.

I need to look into it. Do you have ideas on how we could make the LangChain4j code support easier swapping of LLMs?

ygoron360 commented 8 months ago

I think the easiest way to do it would be swapping Spring profiles (e.g. "openai", "ollama"), with "openai" as the default.

nevenc commented 8 months ago

First of all, Marcus, thanks for putting together this demo application. I loved it! It makes an exciting demo of these technologies!

I agree with @ygoron360 that profiles are probably the best approach. The simplest version would be something like this:

```java
@Profile("openai")
@Configuration
public class OpenAiConfiguration {
    ...
}

@Profile("ollama")
@Configuration
public class OllamaConfiguration {
    ...
}
```
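For what it's worth, a fuller sketch of that profile-based setup could look like the following. This is not code from this repo: the property names (openai.api.key, ollama.base-url, ollama.model) and defaults are my assumptions, using LangChain4j's OpenAiChatModel and OllamaChatModel builders.

```java
import dev.langchain4j.model.chat.ChatLanguageModel;
import dev.langchain4j.model.ollama.OllamaChatModel;
import dev.langchain4j.model.openai.OpenAiChatModel;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Profile;

// Active only when the "openai" profile is selected.
@Profile("openai")
@Configuration
class OpenAiConfiguration {

    @Bean
    ChatLanguageModel chatModel(@Value("${openai.api.key}") String apiKey) {
        return OpenAiChatModel.builder()
                .apiKey(apiKey)
                .build();
    }
}

// Active only when the "ollama" profile is selected.
// base-url and model values here are illustrative defaults.
@Profile("ollama")
@Configuration
class OllamaConfiguration {

    @Bean
    ChatLanguageModel chatModel(
            @Value("${ollama.base-url:http://localhost:11434}") String baseUrl,
            @Value("${ollama.model:llama2}") String modelName) {
        return OllamaChatModel.builder()
                .baseUrl(baseUrl)
                .modelName(modelName)
                .build();
    }
}
```

You'd then run with e.g. SPRING_PROFILES_ACTIVE=ollama, and since the rest of the application would depend only on the ChatLanguageModel interface, swapping providers would need no code changes.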
edeandrea commented 2 weeks ago

@nevenc just FYI: we've just added a Quarkus version on the quarkus branch, and it lets you do exactly what you're looking for.

The README describes how to do that.