Hi Jeff,
Your page-summarizer extension looks very cool and handy, so I was wondering if you'd consider making the API endpoint configurable.
Background: Ollama is a project that enables users to run LLMs locally on their own computers. A couple of days ago, v0.1.24 gained compatibility with the OpenAI Chat Completions API, which (in theory) means that projects like page-summarizer only need to update their config (mostly the API endpoint and model name) and everything should "just work" :tm:
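Concretely, only two settings would need to change. As a rough sketch (the key names here are made up, but the URL is Ollama's default local address for its OpenAI-compatible endpoint, and the model is just an example of one a user might have pulled):

```json
{
  "api_endpoint": "http://localhost:11434/v1/chat/completions",
  "model": "llama2"
}
```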
Thanks, and take care!