fbgallet / roam-extension-speech-to-roam


Open source or alternative APIs? #2

Closed · cori closed this issue 3 months ago

cori commented 4 months ago

I was super psyched to see the direction you were taking this when the commits started to roll into the Depot; thanks for all the work!

I haven't ~installed it yet or~ looked deeply at the code, so forgive me and feel free to tell me to go do so, but I didn't see any mention of alternative OpenAI-API-Compatible API support; is that something you're contemplating or would entertain a PR for (not that I could necessarily submit that myself, but it's not inconceivable ;-) ).

By this I mean instead of https://api.openai.com/v1 use some other OPENAI_BASE_URL - I often use OpenRouter, so https://openrouter.ai/api/v1, or if I want to send something to the LLM running on my local network I could use http://my-great-ollama-server.local:11434/v1 to keep things in my own home network.

I definitely realize this isn't necessarily transparent, and that things like Whisper support wouldn't naturally come along....
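For orientation, the swap cori describes is essentially a constructor option if the extension uses the official `openai` JavaScript client: the same client can target any OpenAI-compatible server just by changing the base URL. A minimal sketch, assuming that client (the keys are placeholders and the URLs are only examples, not the extension's actual code):

```js
import OpenAI from "openai";

// Default OpenAI endpoint (baseURL omitted)
const openai = new OpenAI({
  apiKey: OPENAI_API_KEY, // placeholder
  dangerouslyAllowBrowser: true, // required because Roam extensions run in the browser
});

// Same client, pointed at OpenRouter's OpenAI-compatible API
const openrouter = new OpenAI({
  apiKey: OPENROUTER_API_KEY, // placeholder: an OpenRouter key, not an OpenAI key
  baseURL: "https://openrouter.ai/api/v1",
  dangerouslyAllowBrowser: true,
});

// Same client again, pointed at a local Ollama server
const ollama = new OpenAI({
  apiKey: "ollama", // Ollama ignores the key, but the client wants a non-empty string
  baseURL: "http://my-great-ollama-server.local:11434/v1",
  dangerouslyAllowBrowser: true,
});
```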

Pathsis commented 4 months ago

I'd like it to support large local models as well, such as Llama 3 running via Ollama.

fbgallet commented 4 months ago

@cori I've made a quick test with OpenRouter, still using the OpenAI client, and it works (see the 'Alternative-endpoint' branch or here). It could be added in the next update, with an option to select from several models and possibly to use the OpenRouter endpoint only.

About a local Ollama server, I suppose it would be possible. I've not tested it yet, but it would be great, for sure!
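To illustrate the "selection of models" point above: the request itself is identical whichever endpoint is used, only the model id changes, so a settings-driven model picker is mostly a string swap. A rough sketch reusing the clients from the earlier example (the model ids are only examples of each provider's naming, not ids the extension ships with):

```js
// OpenRouter uses "vendor/model" ids; in practice this would come from the extension settings
const viaOpenRouter = await openrouter.chat.completions.create({
  model: "meta-llama/llama-3-8b-instruct", // example id
  messages: [{ role: "user", content: "Summarize this block" }],
});

// A local Ollama server uses the name of a locally pulled model
const viaOllama = await ollama.chat.completions.create({
  model: "llama3", // example: whatever `ollama pull llama3` installed
  messages: [{ role: "user", content: "Summarize this block" }],
});
```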

fbgallet commented 4 months ago

@cori @Pathsis, the same branch now supports a local Ollama server. It will be merged in the next update. The only requirement is to manually define the environment variable for the CORS origin.

You will need to enter this command in the terminal and restart `ollama serve` for it to work: `launchctl setenv OLLAMA_ORIGINS "*"` (on macOS).
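For context, the variable is needed because the extension runs inside the Roam web app, so the call to the local server is a cross-origin browser request that Ollama rejects unless `OLLAMA_ORIGINS` allows it. Assuming standard `launchctl` behaviour on macOS, you can check that the value was picked up before restarting `ollama serve`:

```bash
launchctl getenv OLLAMA_ORIGINS
# should print: *
```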

Pathsis commented 3 months ago

@fbgallet Thanks! It's so exciting to see! Have you tested it on Linux? I'm using Ubuntu, and when the time comes, does running `launchctl setenv OLLAMA_ORIGINS "*"` work just as well?

cori commented 3 months ago

This sounds great @fbgallet - thanks so much and I can't wait to test it!

@Pathsis if you're running Ollama on Linux it will depend on how you're running it. https://github.com/ollama/ollama/blob/main/docs/faq.md#setting-environment-variables-on-linux has more info.
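For Ubuntu specifically, where Ollama usually runs as a systemd service, the linked FAQ amounts to adding a service override rather than using `launchctl` (which is macOS-only). A sketch of those steps, assuming the default `ollama.service` unit name:

```bash
# Open an override file for the Ollama service
sudo systemctl edit ollama.service

# In the editor that opens, add the following, then save and exit:
#   [Service]
#   Environment="OLLAMA_ORIGINS=*"

# Reload systemd and restart Ollama
sudo systemctl daemon-reload
sudo systemctl restart ollama
```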

fbgallet commented 3 months ago

The update has been submitted for review. I've added some instructions to the extension documentation about the Ollama server configuration. Thanks @cori for the link to the Ollama FAQ.

fbgallet commented 3 months ago

@Pathsis @cori The update has been released.