alexandreroman / news-factory

Create AI-generated newsletters with Spring AI
Apache License 2.0

Add options to run the app with a local LLM #5

Open alexandreroman opened 4 months ago

alexandreroman commented 4 months ago

Right now the app relies on public LLM services such as OpenAI or Amazon Bedrock. As a result, the app requires Internet connectivity, as well as a billable account in some cases.

Running an LLM locally (for example with Ollama) would allow the application to be used on a development workstation.
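
To illustrate the idea, here is a minimal sketch of what calling a local Ollama instance through Spring AI could look like. The class, method names, and property values are illustrative rather than taken from this repository, and they assume the Spring AI Ollama starter's auto-configuration (artifact and property names may differ between Spring AI milestones).

```java
// A minimal sketch, not the project's actual code. It assumes the Spring AI
// Ollama starter (org.springframework.ai:spring-ai-ollama-spring-boot-starter)
// is on the classpath and that a local Ollama daemon is listening on its
// default port. Assumed properties (names may vary between milestones):
//   spring.ai.ollama.base-url=http://localhost:11434
//   spring.ai.ollama.chat.options.model=mistral

import org.springframework.ai.chat.client.ChatClient;
import org.springframework.stereotype.Service;

@Service
class NewsletterWriter {

    private final ChatClient chatClient;

    // With the Ollama starter auto-configured, the injected ChatClient.Builder
    // is backed by the local model instead of OpenAI or Bedrock; the calling
    // code does not need to change.
    NewsletterWriter(ChatClient.Builder builder) {
        this.chatClient = builder.build();
    }

    String summarize(String article) {
        return chatClient.prompt()
                .user("Summarize this article for a newsletter: " + article)
                .call()
                .content();
    }
}
```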

nevenc commented 4 months ago

I've opened a pull request adding Ollama support. I used llama2, but the default model is mistral. The behaviour is quite similar for the given prompts.
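
For anyone trying this out, switching between llama2 and the default mistral should not require code changes. Below is a hedged sketch, assuming the Spring AI Ollama options API available in current milestones (OllamaOptions.create()/withModel may be named differently in other versions); the default model itself would normally come from a property such as spring.ai.ollama.chat.options.model.

```java
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.ai.ollama.api.OllamaOptions;

// Illustrative only: overrides the default Ollama model (e.g. mistral) for a
// single request, so prompts can be compared across models without touching
// the application's configuration.
class ModelSwitchExample {

    String callWithLlama2(ChatClient chatClient, String prompt) {
        return chatClient.prompt()
                .user(prompt)
                .options(OllamaOptions.create().withModel("llama2"))
                .call()
                .content();
    }
}
```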

alexandreroman commented 4 months ago

That's great, thanks for your contribution! We're about to merge some work done as part of issue #2 into main (see the features/multi-ai-providers-all-mvn-deps branch for details).

Could you please rebase your work on commit 6caca721e632853dea44bdca038fc54197283f42? We should then be able to merge your work onto the updated main branch in the coming days.

alexandreroman commented 4 months ago

The features/multi-ai-providers-all-mvn-deps branch has been merged into main (see #2). You should be able to rebase your work now.

nevenc commented 3 months ago

Added support for Ollama in this PR.