mrseanryan / gpt-docs-chat

Chat with local LLM about your PDF and text documents, privacy ensured [llamaindex and llama3]
MIT License

for performance: try host llm on ec2 - docs and indexes stay local #3

Open mrseanryan opened 6 months ago

mrseanryan commented 6 months ago

Can the Ollama URL be configured to point to a remote box?
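A minimal sketch of one way this could work: resolve the Ollama endpoint from the `OLLAMA_HOST` environment variable (which the Ollama tooling itself respects), falling back to the local default. The helper name and the EC2 host are illustrative assumptions, not part of this repo.

```python
import os

# Hypothetical helper: build the Ollama base URL from OLLAMA_HOST,
# defaulting to the local instance. Pointing this at an EC2 host
# would route inference remotely while docs and indexes stay local.
def ollama_base_url() -> str:
    host = os.environ.get("OLLAMA_HOST", "localhost")
    return f"http://{host}:11434"

print(ollama_base_url())
```

LlamaIndex's `Ollama` LLM class accepts a `base_url` argument, so the value above could be passed straight through, e.g. `Ollama(model="llama3", base_url=ollama_base_url())`.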

Or try using an SSH tunnel to make the remote Ollama appear to be local.
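A sketch of the tunnel idea with standard OpenSSH local port forwarding; `<ec2-host>` and the `ubuntu` user are placeholders, and 11434 is Ollama's default port:

```shell
# Forward local port 11434 to port 11434 on the EC2 box, so clients
# that expect a local Ollama at http://localhost:11434 work unchanged.
# -N: no remote command, just forwarding.
ssh -N -L 11434:localhost:11434 ubuntu@<ec2-host>
```

With the tunnel up, no client-side configuration changes should be needed.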

mrseanryan commented 6 months ago

Try an existing web UI:

https://github.com/open-webui/open-webui