PatelKrunal0710 opened 2 months ago
I am getting the same error.
Make sure the environment variable `HUGGINGFACE_TOKEN` is present and correct.
If using Docker: for some reason they use 100:100 for the 'worker' user, so you need this:
`chown 100:100 models local_data`
Or, if you want to be like me, alter Dockerfile.external to force uid 1000:
`RUN adduser --system -u 1000 worker`
When you have this up, you'll need to run:
ollama pull mistral
ollama pull nomic-embed-text
if using the default settings in the repo
> When you have this up, you'll need to run `ollama pull mistral` and `ollama pull nomic-embed-text` if using the default settings in the repo
@stevenlafl Quite new around here, but where do you mean we should run the above quoted code? Whenever I run `docker compose up`, the `private-gpt` container stops and I cannot run `exec` to get inside of it.
The same is happening to me with podman. I'll test @stevenlafl's solution of forcing the uid.

@dmmarmol as soon as you start the container, because from the beginning there are no models on Ollama. I think it makes sense to keep it separate; otherwise the image will be too bloated. You can, however, use volumes to revert to previously downloaded or fine-tuned models.
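For the volume approach, a minimal compose fragment; the service name and host path here are assumptions, but the official Ollama image does keep pulled models under `/root/.ollama`:

```yaml
services:
  ollama:
    volumes:
      # Persist pulled models on the host so they survive container recreation
      - ./models/ollama:/root/.ollama
```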
@stevenlafl It is strange, however, that two weeks ago everything was working. Do you know what happened?
Hi All,
I am getting the same error today. Any hints on how to resolve it?
I tried what @stevenlafl mentioned in Dockerfile.external, as below, but I still have the same problem: `RUN adduser --system --uid 1000 worker`

`2024-05-29 12:30:32 PermissionError: [Errno 13] Permission denied: 'tiktoken_cache'`
Regards, Vijayan Ramachandran
> I tried what @stevenlafl mentioned in Dockerfile.external, as below, but I still have the same problem: `RUN adduser --system --uid 1000 worker`
> `2024-05-29 12:30:32 PermissionError: [Errno 13] Permission denied: 'tiktoken_cache'`
Outside of Docker, what is your uid? The default for many Linux distros is 1000, so I set the image uid to match the host's. If your uid is not 1000, you will still see permission errors.
If that doesn't work, you might want to `rm` or `chown` the `tiktoken_cache` directory and try again, because it may have been created with the wrong ownership.
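A sketch of that cleanup; the path is an assumption (run it from the checkout where the cache directory was created):

```shell
# Remove a tiktoken_cache left behind with the wrong ownership so the
# container can recreate it with the right one on the next start.
rm -rf tiktoken_cache
# Or keep its contents and reassign them to the container's uid instead:
# sudo chown -R 1000:1000 tiktoken_cache
```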
> @stevenlafl Quite new around here, but where do you mean we should run the above quoted code? Whenever I run `docker compose up`, the `private-gpt` container stops and I cannot run `exec` to get inside of it.
I meant to temporarily modify the docker-compose to set `tty` enabled and the entrypoint to `/bin/bash`, enabling you to go into the shell and run those commands. Alternatively, run `docker exec -it` directly with the same volume mounts as the docker-compose has.
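A sketch of that temporary override; the service name `private-gpt` is an assumption, so match whatever the repo's docker-compose actually uses:

```yaml
services:
  private-gpt:
    tty: true              # keep a TTY attached so the shell doesn't exit immediately
    entrypoint: /bin/bash  # start a shell instead of the application
```

With this in place, `docker compose up -d` leaves the container idling, and you can attach a shell to it with the same volume mounts the app would see.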
I had the same error recently, and added this line to Dockerfile.external:
`RUN chown worker /home/worker/app`
It's working now.
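For context, a hedged sketch of where that line might sit in Dockerfile.external; the surrounding lines are assumptions about the file's layout, not its actual contents:

```dockerfile
# After the app files are copied in, give the runtime user ownership of
# the app directory so startup can create caches (tiktoken_cache, the
# Hugging Face hub cache) there.
RUN chown worker /home/worker/app
# (assumption) the image then switches to the unprivileged user:
USER worker
```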
Thanks @wlane, it is working now.
Thanks @stevenlafl
Below is the docker log for your reference (cleaned up; the repeated injector dependency-resolution frames are condensed):

```
2024-04-22 16:53:57 11:23:57.071 [INFO ] private_gpt.settings.settings_loader - Starting application with profiles=['default', 'docker']
2024-04-22 16:53:59 There was a problem when trying to write in your cache folder (/nonexistent/.cache/huggingface/hub). You should set the environment variable TRANSFORMERS_CACHE to a writable directory.
2024-04-22 16:53:59 None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
2024-04-22 16:54:00 11:24:00.684 [WARNING ] matplotlib - Matplotlib created a temporary cache directory at /tmp/matplotlib-03i7r7k7 because the default path (/nonexistent/.config/matplotlib) is not a writable directory; it is highly recommended to set the MPLCONFIGDIR environment variable to a writable directory, in particular to speed up the import of Matplotlib and to better support multiprocessing.
2024-04-22 16:54:00 11:24:00.928 [INFO ] matplotlib.font_manager - generated new fontManager
2024-04-22 16:54:03 --- Logging error ---
2024-04-22 16:54:03 Traceback (most recent call last):
  File "/home/worker/app/.venv/lib/python3.11/site-packages/huggingface_hub/utils/_errors.py", line 270, in hf_raise_for_status
    response.raise_for_status()
  File "/home/worker/app/.venv/lib/python3.11/site-packages/requests/models.py", line 1021, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2/resolve/main/config.json

The above exception was the direct cause of the following exception:

huggingface_hub.utils._errors.GatedRepoError: 401 Client Error. (Request ID: Root=1-662648d3-7421a33e51f4ccba624d63e0;de92b4f1-e8fd-43d2-a97c-b8e591d234b6)
Cannot access gated repo for url https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2/resolve/main/config.json.
Repo model mistralai/Mistral-7B-Instruct-v0.2 is gated. You must be authenticated to access it.

[non-fatal: the error above was raised while logging this warning from private_gpt/components/llm/llm_component.py, line 37, and tripped a secondary
 "TypeError: not all arguments converted during string formatting" inside the logging module]
Message: 'Failed to download tokenizer %s. Falling back to default tokenizer.'
Arguments: ('mistralai/Mistral-7B-Instruct-v0.2', OSError('You are trying to access a gated repo. ...'))

2024-04-22 16:54:03 11:24:03.602 [INFO ] private_gpt.components.llm.llm_component - Initializing the LLM in mode=ollama
2024-04-22 16:54:04 11:24:04.833 [INFO ] private_gpt.components.embedding.embedding_component - Initializing the embedding model in mode=ollama
2024-04-22 16:54:04 11:24:04.847 [INFO ] llama_index.core.indices.loading - Loading all indices.
2024-04-22 16:54:09 Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "/home/worker/app/private_gpt/__main__.py", line 5, in <module>
    from private_gpt.main import app
  File "/home/worker/app/private_gpt/main.py", line 6, in <module>
    app = create_app(global_injector)
  File "/home/worker/app/private_gpt/launcher.py", line 63, in create_app
    ui = root_injector.get(PrivateGptUi)
  [injector frames condensed: injector/__init__.py walks the dependency graph, raising and handling
   KeyError for PrivateGptUi -> IngestService -> LLMComponent at 16:54:03 and
   PrivateGptUi -> ChatService at 16:54:09 while constructing each component]
  File "/home/worker/app/private_gpt/server/chat/chat_service.py", line 96, in __init__
    self.index = VectorStoreIndex.from_vector_store(
  File "/home/worker/app/.venv/lib/python3.11/site-packages/llama_index/core/indices/vector_store/base.py", line 103, in from_vector_store
    return cls(
  File "/home/worker/app/.venv/lib/python3.11/site-packages/llama_index/core/indices/vector_store/base.py", line 74, in __init__
    super().__init__(
  File "/home/worker/app/.venv/lib/python3.11/site-packages/llama_index/core/indices/base.py", line 99, in __init__
    or transformations_from_settings_or_context(Settings, service_context)
  File "/home/worker/app/.venv/lib/python3.11/site-packages/llama_index/core/settings.py", line 304, in transformations_from_settings_or_context
    return settings.transformations
  File "/home/worker/app/.venv/lib/python3.11/site-packages/llama_index/core/settings.py", line 241, in transformations
    self._transformations = [self.node_parser]
  File "/home/worker/app/.venv/lib/python3.11/site-packages/llama_index/core/settings.py", line 144, in node_parser
    self._node_parser = SentenceSplitter()
  File "/home/worker/app/.venv/lib/python3.11/site-packages/llama_index/core/node_parser/text/sentence.py", line 91, in __init__
    self._tokenizer = tokenizer or get_tokenizer()
  File "/home/worker/app/.venv/lib/python3.11/site-packages/llama_index/core/utils.py", line 129, in get_tokenizer
    enc = tiktoken.encoding_for_model("gpt-3.5-turbo")
  File "/home/worker/app/.venv/lib/python3.11/site-packages/tiktoken/model.py", line 97, in encoding_for_model
    return get_encoding(encoding_name_for_model(model_name))
  File "/home/worker/app/.venv/lib/python3.11/site-packages/tiktoken/registry.py", line 73, in get_encoding
    enc = Encoding(**constructor())
  File "/home/worker/app/.venv/lib/python3.11/site-packages/tiktoken_ext/openai_public.py", line 64, in cl100k_base
    mergeable_ranks = load_tiktoken_bpe(
  File "/home/worker/app/.venv/lib/python3.11/site-packages/tiktoken/load.py", line 123, in load_tiktoken_bpe
    contents = read_file_cached(tiktoken_bpe_file)
  File "/home/worker/app/.venv/lib/python3.11/site-packages/tiktoken/load.py", line 53, in read_file_cached
    os.makedirs(cache_dir, exist_ok=True)
  File "<frozen os>", line 225, in makedirs
PermissionError: [Errno 13] Permission denied: 'tiktoken_cache'
```
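Separately from the ownership fixes above, the final PermissionError can also be sidestepped by pointing tiktoken's cache at a writable absolute path. A minimal sketch, assuming nothing imports tiktoken before the variable is set (`TIKTOKEN_CACHE_DIR` is the environment variable tiktoken consults when caching BPE files):

```python
import os
import tempfile

# Point tiktoken's cache at a guaranteed-writable absolute path, so its
# internal os.makedirs(cache_dir) call never falls back to creating
# 'tiktoken_cache' in a read-only working directory.
cache_dir = os.path.join(tempfile.gettempdir(), "tiktoken_cache")
os.makedirs(cache_dir, exist_ok=True)
os.environ["TIKTOKEN_CACHE_DIR"] = cache_dir
# import tiktoken only after this point; it will write BPE files here
```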