piqoni / matcha

Daily Digest Reader

feat: allow user to change openai base url and model for LocalAI support #35

Closed · barakplasma closed this 1 year ago

barakplasma commented 1 year ago

By changing the OpenAI base URL, we can use LocalAI to summarize articles instead of ChatGPT. This also lets the user change the model (e.g. gpt-4, gpt-3.5-turbo, or openllama).

Also bumps the following dependencies:

- github.com/sashabaranov/go-openai v1.14.2
- github.com/spf13/viper v1.16.0

My Go linter also removed a bunch of trailing spaces by accident.


I tested with and without openai_base_url and openai_model set in the config to make sure existing configs keep working.
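
For reference, here is a rough sketch of how the two config keys can be plumbed into go-openai via viper. This is illustrative only, not the exact code in this PR; the helper name, defaults, and example URL are made up.

```go
package main

import (
	"fmt"

	openai "github.com/sashabaranov/go-openai"
	"github.com/spf13/viper"
)

// newSummarizer builds a go-openai client whose base URL and model come from
// the user's config file, falling back to the official OpenAI endpoint and
// gpt-3.5-turbo when the keys are absent.
func newSummarizer(apiKey string) (*openai.Client, string) {
	viper.SetDefault("openai_base_url", "https://api.openai.com/v1")
	viper.SetDefault("openai_model", openai.GPT3Dot5Turbo)

	cfg := openai.DefaultConfig(apiKey)
	// Point the client at LocalAI (or any other OpenAI-compatible server)
	// when openai_base_url is set, e.g. http://localhost:8080/v1.
	cfg.BaseURL = viper.GetString("openai_base_url")

	// The chosen model is passed along with each chat completion request.
	return openai.NewClientWithConfig(cfg), viper.GetString("openai_model")
}

func main() {
	client, model := newSummarizer("sk-...")
	fmt.Printf("client %T will use model %s\n", client, model)
}
```

Because only the base URL and model name change, the existing chat completion call path stays the same regardless of which backend is serving the API.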

piqoni commented 1 year ago

Very cool, this looks quite useful. I can't fully test it (I do have llama2 on one machine but don't have LocalAI), but LGTM :) Thanks for the addition!

barakplasma commented 1 year ago

By the way, this also works with the litellm OpenAI proxy and with ollama + mistralai.

krrishdholakia commented 1 year ago

Hey @barakplasma, any tweaks required for the litellm proxy besides the int(time.time) patch?

barakplasma commented 1 year ago

@krrishdholakia

> Hey @barakplasma, any tweaks required for the litellm proxy besides the int(time.time) patch?

I don't think any other changes are needed. In my local patch I wrapped all the other occurrences of time.time with int(), but doing it at the top of main.py should be enough.