Mintplex-Labs / anythingllm-docs

Documentation of AnythingLLM by Mintplex Labs Inc.
https://docs.anythingllm.com
MIT License
104 stars 27 forks

[Guide] How to access AnythingLLM (Local - Docker) from different devices #21

Open ShadowArcanist opened 4 months ago

ShadowArcanist commented 4 months ago

On our Discord, people often ask, "How can I access AnythingLLM from my mobile phone or a different machine?"

Users can access AnythingLLM by exposing their localhost to the internet with something like ngrok or a Cloudflare Tunnel, but there is no guide or YouTube tutorial covering this, so it would be good to write one.
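For reference, such a guide would boil down to a couple of commands. This is a minimal sketch, assuming AnythingLLM's Docker container is listening on its default port 3001 (the port is an assumption; adjust it to whatever `-p` mapping the container was started with):

```shell
# Assumes AnythingLLM is running locally in Docker and reachable at http://localhost:3001

# Option 1: ngrok (requires a free ngrok account and authtoken)
ngrok http 3001
# ngrok prints a public https URL that forwards to localhost:3001

# Option 2: Cloudflare quick tunnel (temporary URL, no account needed)
cloudflared tunnel --url http://localhost:3001
# cloudflared prints a temporary trycloudflare.com URL
```

Either URL can then be opened from a phone or another machine while the host computer stays awake.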

timothycarambat commented 4 months ago

My only reservation is that this is not a really "safe" way to run AnythingLLM in a more "server-like" setting. They should really use container cloud services like Render or Railway, because tunneling their local machine basically opens a tunnel straight to their desktop computer, and if the computer goes to sleep the application won't work anyway.

ShadowArcanist commented 4 months ago

@timothycarambat I don't have much knowledge in this area 😄. I saw a lot of people asking how to do it, so maybe we can write the guide and add a warning message at the beginning of it.

A red warning/caution like this: [attached image]

What do you think about this approach? 🤔

timothycarambat commented 4 months ago

That provider is different from the primary issue this is filed under. That LLM provider is basically an "open" OpenAI-compatible provider that allows you to use providers we don't yet have built-in support for. But yes, that warning would be appropriate.