CSLukkun / ob_daily_summary

MIT License

WIP: Add ollama support #3

Closed · HighPriest closed this 16 hours ago

HighPriest commented 23 hours ago

To run Ollama models, all we need is:

If no model is entered, the OpenAI API is used instead.

I now recall that the OpenAI API also allows choosing between different APIs, so this automatic selection has to be replaced with a toggle, or some other precise way to distinguish whether the user wants to use Ollama or OpenAI.
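One way to make that distinction explicit is a provider field in the plugin settings instead of inferring the backend from whether a model name was entered. The sketch below is a hypothetical TypeScript shape for such a toggle; the setting names (`provider`, `ollamaBaseUrl`, etc.) are assumptions for illustration, not this plugin's actual configuration, though the Ollama endpoint path (`/api/chat` on port 11434) matches Ollama's documented API.

```typescript
// Hypothetical settings shape with an explicit provider toggle.
type Provider = "openai" | "ollama";

interface SummarySettings {
  provider: Provider;    // discrete choice, no guessing from model name
  openAiModel: string;   // e.g. "gpt-4o-mini"
  ollamaModel: string;   // e.g. "llama3"
  ollamaBaseUrl: string; // local Ollama server, default http://localhost:11434
}

// Resolve which endpoint and model to call from the toggle alone.
function resolveEndpoint(s: SummarySettings): { url: string; model: string } {
  if (s.provider === "ollama") {
    return { url: `${s.ollamaBaseUrl}/api/chat`, model: s.ollamaModel };
  }
  return {
    url: "https://api.openai.com/v1/chat/completions",
    model: s.openAiModel,
  };
}
```

With this shape, an empty `openAiModel` can never silently route a request to the wrong backend; the user's choice is always explicit.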

HighPriest commented 23 hours ago

I am temporarily setting this PR to WIP. I need to extend the configuration with a discrete distinction between running OpenAI and Ollama.