💜 The best free Telegram bot for ChatGPT, Microsoft Copilot (aka Bing AI / Sidney / EdgeGPT), Microsoft Copilot Designer (aka BingImageCreator), Gemini and Groq with stream writing, requests with images, multiple languages, admin control, data logging and more!
[2024-02-26 15:14:54] [25 ] [INFO ] Logging setup is complete for current process
[2024-02-26 15:14:55] [25 ] [INFO ] Loading /app/config/users.json
[2024-02-26 15:14:55] [25 ] [INFO ] Loaded json from /app/config/users.json
[2024-02-26 15:14:55] [25 ] [INFO ] Saving to /app/config/users.json
[2024-02-26 15:14:55] [25 ] [INFO ] Loading /app/config/users.json
[2024-02-26 15:14:55] [25 ] [INFO ] Loaded json from /app/config/users.json
[2024-02-26 15:14:55] [25 ] [INFO ] Saving to /app/config/users.json
[2024-02-26 15:14:55] [25 ] [INFO ] Initializing ChatGPT module with proxy http://192.168.3.28:7890
[2024-02-26 15:14:55] [25 ] [INFO ] Initializing ChatGPT module with API type 1
[2024-02-26 15:14:56] [25 ] [ERROR ] Error processing request!
Traceback (most recent call last):
File "PyInstaller/loader/pyimod03_ctypes.py", line 53, in __init__
File "ctypes/__init__.py", line 374, in __init__
OSError: /tmp/_MEIBhR5Em/tls_client/dependencies/tls-client-x86.so: cannot open shared object file: No such file or directory
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "QueueHandler.py", line 303, in _request_processor
File "ChatGPTModule.py", line 114, in initialize
File "ChatGPTModule.py", line 82, in initialize
File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
File "PyInstaller/loader/pyimod02_importers.py", line 419, in exec_module
File "revChatGPT/V1.py", line 37, in <module>
File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
File "PyInstaller/loader/pyimod02_importers.py", line 419, in exec_module
File "OpenAIAuth.py", line 6, in <module>
File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
File "PyInstaller/loader/pyimod02_importers.py", line 419, in exec_module
File "tls_client/__init__.py", line 15, in <module>
File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
File "PyInstaller/loader/pyimod02_importers.py", line 419, in exec_module
File "tls_client/sessions.py", line 1, in <module>
File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
File "PyInstaller/loader/pyimod02_importers.py", line 419, in exec_module
File "tls_client/cffi.py", line 20, in <module>
File "ctypes/__init__.py", line 452, in LoadLibrary
File "PyInstaller/loader/pyimod03_ctypes.py", line 55, in __init__
pyimod03_ctypes.install.<locals>.PyInstallerImportError: Failed to load dynlib/dll '/tmp/_MEIBhR5Em/tls_client/dependencies/tls-client-x86.so'. Most likely this dynlib/dll was not found when the application was frozen.
[2024-02-26 15:14:56] [7 ] [INFO ] Trying to kill process with PID 25
[2024-02-26 15:14:56] [7 ] [INFO ] Setting prevent_shutdown_flag
[2024-02-26 15:14:56] [7 ] [INFO ] Killed? True
[2024-02-26 15:14:56] [7 ] [INFO ] Container with id 696025909 (PID 25) was removed from the queue
Running in Docker.
The host system is Ubuntu 22.04.1 LTS (GNU/Linux 5.15.0-97-generic x86_64).
I can't understand the error information. What causes this?
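For context, the root failure appears to be the first OSError in the traceback: at import time, tls_client calls ctypes to dlopen tls-client-x86.so from the PyInstaller extraction directory (/tmp/_MEIBhR5Em/...), and the file is not there, most likely because it was not collected when the application was frozen. A minimal sketch of that failure mode (the path below is hypothetical, standing in for the missing bundled library):

```python
import ctypes

# Loading a shared object that was never bundled into the frozen app
# raises the same OSError seen at the top of the traceback.
# This path is hypothetical; in the real app it is the PyInstaller
# temp dir, e.g. /tmp/_MEI.../tls_client/dependencies/tls-client-x86.so
missing_lib = "/tmp/_MEI_example/tls_client/dependencies/tls-client-x86.so"

try:
    ctypes.CDLL(missing_lib)
except OSError as exc:
    # e.g. "cannot open shared object file: No such file or directory"
    print(f"Failed to load: {exc}")
```

If that is the cause, the fix would usually be on the build side, e.g. telling PyInstaller to collect the library (something like `--add-binary` pointing at tls_client's dependencies directory), rather than anything in the runtime environment.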