datacrystals / AIStoryWriter

LLM story writer with a focus on high-quality long output based on a user provided prompt.
GNU Affero General Public License v3.0
63 stars 17 forks

Remove OLLAMA Host Requirement #36

Closed datacrystals closed 4 months ago

datacrystals commented 4 months ago

We need to make the Ollama host optional so the system can be used without Ollama.

LoggeL commented 4 months ago

It would be better in general to make every provider optional and only load its requirements when we detect a model from that provider.
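
For illustration, a minimal sketch of that lazy-loading idea, assuming Python and the standard `importlib` module; the `PROVIDER_MODULES` map and `get_client` helper are hypothetical names, not the project's actual API:

```python
import importlib

# Hypothetical mapping from provider name to the package it needs.
# A provider's SDK is only imported once a model from that provider
# is actually requested, so unused providers need not be installed.
PROVIDER_MODULES = {
    "ollama": "ollama",
    "google": "google.generativeai",
}

_loaded = {}

def get_client(provider: str):
    """Import and cache the module for a provider on first use."""
    if provider not in _loaded:
        module_name = PROVIDER_MODULES[provider]
        try:
            _loaded[provider] = importlib.import_module(module_name)
        except ImportError as err:
            raise RuntimeError(
                f"Provider '{provider}' was requested but its package "
                f"'{module_name}' is not installed."
            ) from err
    return _loaded[provider]
```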

datacrystals commented 4 months ago

Agreed - I was thinking that instead of having a separate host setting at all, it could be baked into the model 'name'. E.g. Ollama on localhost could be ollama://localhost:11434/username/model, and Google could be google://gemini-1.5-flash or something.

Are there any issues that I'm not thinking of with that scheme?
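
For illustration, a minimal sketch of how such model URIs could be split into provider, host, and model name, assuming Python's standard `urllib.parse`; `parse_model_uri` is a hypothetical helper, not the project's actual API:

```python
from urllib.parse import urlparse

def parse_model_uri(uri: str):
    """Split a model URI like ollama://localhost:11434/username/model
    into (provider, host, model)."""
    parsed = urlparse(uri)
    provider = parsed.scheme
    if parsed.path:
        # A host component is present, e.g. ollama://localhost:11434/username/model
        host, model = parsed.netloc, parsed.path.lstrip("/")
    else:
        # No host, e.g. google://gemini-1.5-flash (hosted API, model only)
        host, model = None, parsed.netloc
    return provider, host, model

# Examples from the discussion:
print(parse_model_uri("ollama://localhost:11434/username/model"))
# -> ('ollama', 'localhost:11434', 'username/model')
print(parse_model_uri("google://gemini-1.5-flash"))
# -> ('google', None, 'gemini-1.5-flash')
```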

LoggeL commented 4 months ago

I like that approach. That should also work with OpenRouter as far as I can see. I can implement that today if you'd like.

datacrystals commented 4 months ago

Yes, please! Thanks for your help.
