Closed: KinkGD closed this issue 1 week ago
Sorry to file this as a feature request, but it's more that than a bug report...
This extension is wonderful, but I'm having trouble with LM Studio and the Ollama server.
The extension description states that it's possible to use LM Studio as an Ollama server for privacy-focused usage.
A question, more related to LM Studio (I can't find this in their docs): how do I set the environment variable
OLLAMA_ORIGINS = moz-extension://*
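For context, OLLAMA_ORIGINS is an environment variable read by the Ollama server itself; LM Studio does not read it and instead offers its own CORS toggle in its local server settings. For a plain Ollama server, a minimal sketch of setting the variable before launch (assuming the `ollama` binary is on your PATH) could look like:

```python
import os
import subprocess

# Launch the Ollama server with OLLAMA_ORIGINS set so that requests from
# Thunderbird extensions (moz-extension://* origins) pass the CORS check.
# Assumes `ollama` is installed and on PATH; this call blocks while the
# server runs.
env = dict(os.environ, OLLAMA_ORIGINS="moz-extension://*")
subprocess.run(["ollama", "serve"], env=env, check=True)
```

The same effect comes from exporting the variable in the shell or service manager that starts the server; the key point is that it must be set in the server process's environment, not in Thunderbird's.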
And how do I connect ThunderAI to the LM Studio server?
Any answer, link to documentation, or tutorial would be appreciated.
Regards.
You have to enable the corresponding checkbox: in the ThunderAI options, enable the "v1" checkbox.
[screenshot: the "v1" checkbox in the ThunderAI options]
Then set the correct hostname.
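To check that the hostname is correct, you can query the server's OpenAI-compatible model list. Note that localhost:1234 below is LM Studio's default port and is an assumption, so substitute whatever host/port your server settings show. A minimal stdlib-only check:

```python
import json
import urllib.request

# Verify the LM Studio server is reachable by listing its models through
# the OpenAI-compatible /v1/models endpoint. localhost:1234 is LM Studio's
# default; replace it with the host/port configured in your server settings.
url = "http://localhost:1234/v1/models"
with urllib.request.urlopen(url, timeout=5) as resp:
    print(json.dumps(json.load(resp), indent=2))
```

If this prints a model list, the same base URL (with the v1 option enabled) should work as the hostname in the ThunderAI options.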