DocsGPT is a documentation chatbot that lets you chat with your data. It is privately deployable, enables AI-powered knowledge sharing, and integrates your knowledge into your AI workflow.
I'm trying to install DocsGPT using a Docker container, but running the backend gives me this error:
```
[2023-09-23 04:43:18 +0000] [1] [INFO] Starting gunicorn 20.1.0
[2023-09-23 04:43:18 +0000] [1] [INFO] Listening at: http://0.0.0.0:7091 (1)
[2023-09-23 04:43:18 +0000] [1] [INFO] Using worker: sync
[2023-09-23 04:43:18 +0000] [7] [INFO] Booting worker with pid: 7
[2023-09-23 04:43:18 +0000] [8] [INFO] Booting worker with pid: 8
[nltk_data] Error loading punkt: <urlopen error EOF occurred in
[nltk_data]     violation of protocol (_ssl.c:1007)>
[nltk_data] Error loading punkt: <urlopen error EOF occurred in
[nltk_data]     violation of protocol (_ssl.c:1007)>
[nltk_data] Error loading averaged_perceptron_tagger: <urlopen error
[nltk_data]     EOF occurred in violation of protocol (_ssl.c:1007)>
[nltk_data] Error loading averaged_perceptron_tagger: <urlopen error
[nltk_data]     EOF occurred in violation of protocol (_ssl.c:1007)>
None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
[2023-09-23 04:44:03 +0000] [8] [ERROR] Exception in worker process
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/huggingface_hub/utils/_errors.py", line 259, in hf_raise_for_status
    response.raise_for_status()
  File "/usr/local/lib/python3.10/site-packages/requests/models.py", line 1021, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/openai_chat/resolve/main/tokenizer_config.json

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/transformers/utils/hub.py", line 417, in cached_file
    resolved_file = hf_hub_download(
  File "/usr/local/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn
    return fn(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1195, in hf_hub_download
    metadata = get_hf_file_metadata(
  File "/usr/local/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn
    return fn(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1541, in get_hf_file_metadata
    hf_raise_for_status(r)
  File "/usr/local/lib/python3.10/site-packages/huggingface_hub/utils/_errors.py", line 291, in hf_raise_for_status
    raise RepositoryNotFoundError(message, response) from e
huggingface_hub.utils._errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-650e6d21-777f5e8b4ffc1ee138437055)

Repository Not Found for url: https://huggingface.co/openai_chat/resolve/main/tokenizer_config.json.
Please make sure you specified the correct repo_id and repo_type.
If you are trying to access a private or gated repo, make sure you are authenticated.
Invalid username or password.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/gunicorn/arbiter.py", line 589, in spawn_worker
    worker.init_process()
  File "/usr/local/lib/python3.10/site-packages/gunicorn/workers/base.py", line 134, in init_process
    self.load_wsgi()
  File "/usr/local/lib/python3.10/site-packages/gunicorn/workers/base.py", line 146, in load_wsgi
    self.wsgi = self.app.wsgi()
  File "/usr/local/lib/python3.10/site-packages/gunicorn/app/base.py", line 67, in wsgi
    self.callable = self.load()
  File "/usr/local/lib/python3.10/site-packages/gunicorn/app/wsgiapp.py", line 58, in load
    return self.load_wsgiapp()
  File "/usr/local/lib/python3.10/site-packages/gunicorn/app/wsgiapp.py", line 48, in load_wsgiapp
    return util.import_app(self.app_uri)
  File "/usr/local/lib/python3.10/site-packages/gunicorn/util.py", line 359, in import_app
    mod = importlib.import_module(module)
  File "/usr/local/lib/python3.10/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/app/application/wsgi.py", line 1, in <module>
    from application.app import app
  File "/app/application/app.py", line 58, in <module>
    tokenizer = AutoTokenizer.from_pretrained(model_id)
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 643, in from_pretrained
    tokenizer_config = get_tokenizer_config(pretrained_model_name_or_path, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 487, in get_tokenizer_config
    resolved_config_file = cached_file(
  File "/usr/local/lib/python3.10/site-packages/transformers/utils/hub.py", line 433, in cached_file
    raise EnvironmentError(
OSError: openai_chat is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models'
If this is a private repository, make sure to pass a token having permission to this repo with use_auth_token or log in with huggingface-cli login and pass use_auth_token=True.
[2023-09-23 04:44:03 +0000] [8] [INFO] Worker exiting (pid: 8)
[2023-09-23 04:44:04 +0000] [7] [ERROR] Exception in worker process
(... worker 7 fails with an identical traceback; request ID Root=1-650e6d21-3e161a0a3a20fae47eac6886 ...)
[2023-09-23 04:44:04 +0000] [7] [INFO] Worker exiting (pid: 7)
[2023-09-23 04:44:04 +0000] [1] [WARNING] Worker with pid 7 was terminated due to signal 15
[2023-09-23 04:44:04 +0000] [1] [INFO] Shutting down: Master
[2023-09-23 04:44:04 +0000] [1] [INFO] Reason: Worker failed to boot.
```
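A note on reading this log: it contains two stacked tracebacks joined by "The above exception was the direct cause of the following exception:". That wording comes from Python's exception chaining (`raise ... from e`): `huggingface_hub` catches the raw 401 `HTTPError` and re-raises it as `RepositoryNotFoundError`, so the original HTTP failure stays visible. A minimal sketch of that mechanism (the classes and helper below are stand-ins, not the real `huggingface_hub` code):

```python
# Stand-in exception types mirroring the ones in the log above.
class HTTPError(Exception):
    pass

class RepositoryNotFoundError(Exception):
    pass

def hf_raise_for_status():
    # Stand-in for the huggingface_hub helper of the same name:
    # catch the low-level HTTP failure and chain it into a
    # higher-level error with "raise ... from e".
    try:
        raise HTTPError("401 Client Error: Unauthorized")
    except HTTPError as e:
        raise RepositoryNotFoundError("Repository Not Found") from e

try:
    hf_raise_for_status()
except RepositoryNotFoundError as err:
    # err.__cause__ holds the original HTTPError, which is why the
    # log prints both tracebacks, oldest first.
    print(type(err).__name__, "caused by", type(err.__cause__).__name__)
```

So the bottom-most `OSError: openai_chat is not a local folder and is not a valid model identifier` is the real problem: `application/app.py` passes `openai_chat` to `AutoTokenizer.from_pretrained`, and there is no such repo on the Hugging Face Hub.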
Could someone please help me? I want to install it in my local environment. Alternatively, can you tell me how to install this directly, without Docker? I can run the frontend with

```
npm run dev
```

and it works, but when I try to run the Flask application with

```
python wsgi.py
```

it gives me an error like

```
module not found: application
```

(of course, `application` is the name you used for the folder).
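For what it's worth, this `module not found` error is a plain Python import-path issue, not a DocsGPT bug: `wsgi.py` does `from application.app import app`, so the directory that *contains* the `application` folder must be on `sys.path`. Running `python wsgi.py` from inside `application/` puts `application/` itself on the path, so the package is invisible. A self-contained sketch of the rule (the file layout below is reconstructed from the traceback, not copied from the repo):

```python
# Reconstruct the layout the traceback implies:
#   repo/
#     application/
#       __init__.py
#       app.py    (defines "app")
#       wsgi.py   (does "from application.app import app")
import os
import subprocess
import sys
import tempfile

repo = tempfile.mkdtemp()
pkg = os.path.join(repo, "application")
os.makedirs(pkg)
open(os.path.join(pkg, "__init__.py"), "w").close()
with open(os.path.join(pkg, "app.py"), "w") as f:
    f.write("app = 'flask-app-placeholder'\n")
with open(os.path.join(pkg, "wsgi.py"), "w") as f:
    f.write("from application.app import app\nprint(app)\n")

# Fails: cwd is application/, so the package "application" is not importable.
bad = subprocess.run([sys.executable, "wsgi.py"], cwd=pkg,
                     capture_output=True, text=True)
print("inside application/ -> exit code", bad.returncode)

# Works: run from the repo root as a module, "python -m application.wsgi".
good = subprocess.run([sys.executable, "-m", "application.wsgi"],
                      cwd=repo, capture_output=True, text=True)
print("from repo root -> exit code", good.returncode)
```

So a likely fix is to run the backend from the repository root, e.g. `python -m application.wsgi`, or to mirror the container and let gunicorn import it from the root (the exact gunicorn app URI DocsGPT uses isn't shown in the log, so check the project's Dockerfile for it).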