Open nevenc opened 8 months ago
Fair point. The intent of the project was to show how the different libraries compare, so I haven't focused much on making it easy to change models; as you said, you'd need to make code changes in the LangChain4j example if you want to use Ollama.
I'll need to look into it. Do you have ideas on how we could make the LangChain4j code support easier swapping of LLMs?
I think the easiest way to do it would be swapping Spring profiles (e.g. "open-ai", "ollama"). The default could be "open-ai".
First of all, Marcus, thanks for putting together this demo application - I loved it! It makes an exciting demo of these technologies!
I agree with @ygoron360 - profiles are probably the best approach, with something as simple as this:
```java
@Profile("openai")
@Configuration
public class OpenAiConfiguration {
    ...
}

@Profile("ollama")
@Configuration
public class OllamaConfiguration {
    ...
}
```
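Fleshing that idea out, here is a hedged sketch of what the two profile-gated configurations might look like. The class names, property keys, and default values here are assumptions for illustration, not code from this repo; the builder calls follow the `langchain4j-open-ai` and `langchain4j-ollama` modules:

```java
import dev.langchain4j.model.chat.ChatLanguageModel;
import dev.langchain4j.model.ollama.OllamaChatModel;
import dev.langchain4j.model.openai.OpenAiChatModel;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Profile;

// Only created when the "openai" profile is active.
@Profile("openai")
@Configuration
class OpenAiConfiguration {

    @Bean
    ChatLanguageModel chatLanguageModel(@Value("${openai.api.key}") String apiKey) {
        return OpenAiChatModel.builder()
                .apiKey(apiKey)
                .build();
    }
}

// Only created when the "ollama" profile is active; no API key required.
// The property names and defaults below are hypothetical.
@Profile("ollama")
@Configuration
class OllamaConfiguration {

    @Bean
    ChatLanguageModel chatLanguageModel(
            @Value("${ollama.base-url:http://localhost:11434}") String baseUrl,
            @Value("${ollama.model-name:llama3}") String modelName) {
        return OllamaChatModel.builder()
                .baseUrl(baseUrl)
                .modelName(modelName)
                .build();
    }
}
```

The rest of the application would then depend only on the `ChatLanguageModel` interface, and you could switch providers at startup with something like `--spring.profiles.active=ollama`.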
I want to be able to run this application locally with my own LLM, such as Ollama.
The application currently requires `openai.api.key` in `LangChain4jConfig`. There is no way to skip the langchain4j-openai bean creation or to provide an alternative via langchain4j-ollama. In Spring AI it's easy: you just use a different starter and the app works without code changes, e.g. `spring-ai-ollama-spring-boot-starter` instead of `spring-ai-openai-spring-boot-starter`.
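For the Spring AI module, the switch is a one-line dependency change, sketched below for Maven. The artifact IDs are the ones named above; the `org.springframework.ai` groupId is an assumption about the project's build, and the version is assumed to come from the project's existing dependency management:

```xml
<!-- Swap this single starter to change providers (groupId assumed). -->
<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-ollama-spring-boot-starter</artifactId>
</dependency>
```

Spring AI then auto-configures the matching `ChatModel` bean from the starter on the classpath, which is why no application code has to change.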