ParisNeo / lollms-webui

Lord of Large Language Models Web User Interface
https://parisneo.github.io/lollms-webui/
Apache License 2.0
4.07k stars 515 forks

last update problem #515

Open · Sergey0495 opened this issue 3 months ago

Sergey0495 commented 3 months ago

Launched the application, which had just been updated (5 minutes ago).

Current behavior

INFO: ip:64352 - "POST /apply_settings HTTP/1.1" 500 Internal Server Error
ERROR: Exception in ASGI application
Traceback (most recent call last):
  File "/home/user/installer_files/lollms_env/lib/python3.11/site-packages/uvicorn/protocols/http/h11_impl.py", line 407, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "/home/user/installer_files/lollms_env/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 69, in __call__
    return await self.app(scope, receive, send)
  File "/home/user/installer_files/lollms_env/lib/python3.11/site-packages/engineio/async_drivers/asgi.py", line 67, in __call__
    await self.other_asgi_app(scope, receive, send)
  File "/home/user/installer_files/lollms_env/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "/home/user/installer_files/lollms_env/lib/python3.11/site-packages/starlette/applications.py", line 123, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/home/user/installer_files/lollms_env/lib/python3.11/site-packages/starlette/middleware/errors.py", line 186, in __call__
    raise exc
  File "/home/user/installer_files/lollms_env/lib/python3.11/site-packages/starlette/middleware/errors.py", line 164, in __call__
    await self.app(scope, receive, _send)
  File "/home/user/installer_files/lollms_env/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 62, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "/home/user/installer_files/lollms_env/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/home/user/installer_files/lollms_env/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/home/user/installer_files/lollms_env/lib/python3.11/site-packages/starlette/routing.py", line 758, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/home/user/installer_files/lollms_env/lib/python3.11/site-packages/starlette/routing.py", line 778, in app
    await route.handle(scope, receive, send)
  File "/home/user/installer_files/lollms_env/lib/python3.11/site-packages/starlette/routing.py", line 299, in handle
    await self.app(scope, receive, send)
  File "/home/user/installer_files/lollms_env/lib/python3.11/site-packages/starlette/routing.py", line 79, in app
    await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  File "/home/user/installer_files/lollms_env/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/home/user/installer_files/lollms_env/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/home/user/installer_files/lollms_env/lib/python3.11/site-packages/starlette/routing.py", line 74, in app
    response = await func(request)
  File "/home/user/installer_files/lollms_env/lib/python3.11/site-packages/fastapi/routing.py", line 278, in app
    raw_response = await run_endpoint_function(
  File "/home/user/installer_files/lollms_env/lib/python3.11/site-packages/fastapi/routing.py", line 191, in run_endpoint_function
    return await dependant.call(**values)
  File "/home/user/lollms-webui/lollms_core/lollms/server/endpoints/lollms_configuration_infos.py", line 145, in apply_settings
    forbid_remote_access(lollmsElfServer)
  File "/home/user/lollms-webui/lollms_core/lollms/security.py", line 51, in forbid_remote_access
    raise Exception("This functionality is not allowed if the server is open")
Exception: This functionality is not allowed if the server is open

I edited the file /configs/local_config.yaml and changed the following parameters:

host: 0.0.0.0 # Allow remote connections
port: 9600 # Change the port if desired (default is 9600)
force_accept_remote_access: true # Force accepting remote connections
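For context, the traceback ends in a remote-access guard (forbid_remote_access in lollms/security.py). The following is only a minimal sketch of how such a guard typically works, using the config keys mentioned above; the actual lollms implementation may differ, and notably the reporter hit the exception even with force_accept_remote_access set to true, so the real check may not consult that flag in this version.

```python
# Hypothetical sketch of a remote-access guard; Config and the field names
# mirror the keys from this issue, not the actual lollms classes.

class Config:
    def __init__(self, host="localhost", force_accept_remote_access=False):
        self.host = host
        self.force_accept_remote_access = force_accept_remote_access

def forbid_remote_access(config):
    """Raise if the server is exposed to the network without explicit opt-in."""
    exposed = config.host not in ("localhost", "127.0.0.1")
    if exposed and not config.force_accept_remote_access:
        raise Exception("This functionality is not allowed if the server is open")

# A server bound to 0.0.0.0 without the override triggers the exception:
try:
    forbid_remote_access(Config(host="0.0.0.0"))
except Exception as e:
    print(e)  # This functionality is not allowed if the server is open

# With the override enabled, the call passes silently:
forbid_remote_access(Config(host="0.0.0.0", force_accept_remote_access=True))
```

Under this sketch, applying settings from a remote browser would 500 exactly as reported whenever the exposed-without-override branch is taken.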

As a result, settings changes made in the webui are not applied.

There were no problems in the previous version, before the update.

EmersonBiggons commented 3 months ago

I am also having this issue. No idea what is causing it, but every time I update, BOOM: "500 internal error."

ParisNeo commented 3 months ago

Hi. I really don't encourage opening lollms to the outside this way. Don't forget, this thing can execute code, and you don't want that to be exploitable from the outside. The last upgrade asks for confirmation every time a configuration apply is requested, so for remote use you need to deactivate this from the config file.

But as I said, using it like this is risky, so you need to make sure that your network is properly secured.

The other way to serve lollms is to run it as a headless server, which deactivates all webui functionalities and turns it into a pure generation server; no code execution is allowed. Then you install another lollms on the client machine and use the elf binding to talk to the server. With this, the client doesn't need to be powerful, and at the same time you protect your server from attacks.

I am sorry for making it complex, but without this someone could use lollms to encrypt all your data and demand a ransom, so I'm doing this to protect the users.