tedraymond opened 4 months ago
Hi! PrivateGPT is not supported directly, but as described in issue #23, it may work via a PHP proxy. The difference compared to Ollama is the port: instead of 11434, it's 8001. So replacing this line:
$ollama_url = 'http://localhost:11434/v1/chat/completions';
with this:
$ollama_url = 'http://localhost:8001/v1/chat/completions';
may work. It's not guaranteed, but it's worth a try, based on the Python SDK for the PrivateGPT API.
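The port swap above can be sanity-checked outside the plugin. Here is a minimal Python sketch that builds the same kind of OpenAI-style /v1/chat/completions request the proxy would send, pointed at PrivateGPT's default port. The helper name and payload shape are illustrative assumptions, not part of the plugin or the PrivateGPT SDK:

```python
import json

# Illustrative helper (not from the plugin): builds an OpenAI-compatible
# chat-completions request aimed at PrivateGPT's port (8001) instead of
# Ollama's (11434).
def build_chat_request(prompt, base_url="http://localhost:8001"):
    url = base_url + "/v1/chat/completions"
    payload = {
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }
    return url, json.dumps(payload)

url, body = build_chat_request("Hello")
print(url)  # http://localhost:8001/v1/chat/completions
```

Posting that body to the printed URL (e.g. with curl or PHP's curl extension) while PrivateGPT is running should return a chat completion if the endpoint is actually OpenAI-compatible; if it errors, the plugin swap likely won't work either.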
Using a custom local LLM based on PrivateGPT works well in VSCode and Grafana, but the Notepad++ plugin seems to struggle. Has anyone else tried this?