With a fresh, clean install of Shark (nodai_shark_studio_20240430_1250), selecting SDXL 1.0 as the base model from the dropdown menu causes compilation to fail, because the pipeline requests an incorrect repository URL.
[LOG] Initializing new pipeline...
Traceback (most recent call last):
File "huggingface_hub\utils\_errors.py", line 304, in hf_raise_for_status
File "requests\models.py", line 1021, in raise_for_status
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/stabilityai/stable-diffusion-xl-1.0/resolve/main/unet/config.json
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "diffusers\configuration_utils.py", line 371, in load_config
File "huggingface_hub\utils\_validators.py", line 114, in _inner_fn
File "huggingface_hub\file_download.py", line 1221, in hf_hub_download
File "huggingface_hub\file_download.py", line 1325, in _hf_hub_download_to_cache_dir
File "huggingface_hub\file_download.py", line 1823, in _raise_on_head_call_error
File "huggingface_hub\file_download.py", line 1722, in _get_metadata_or_catch_error
File "huggingface_hub\utils\_validators.py", line 114, in _inner_fn
File "huggingface_hub\file_download.py", line 1645, in get_hf_file_metadata
File "huggingface_hub\file_download.py", line 372, in _request_wrapper
File "huggingface_hub\file_download.py", line 396, in _request_wrapper
File "huggingface_hub\utils\_errors.py", line 352, in hf_raise_for_status
huggingface_hub.utils._errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-664d3d47-6e28e22e7b0c918634b8a307;23a5b50e-b03c-4420-ba29-b4fd3891ac37)
Repository Not Found for url: https://huggingface.co/stabilityai/stable-diffusion-xl-1.0/resolve/main/unet/config.json.
Please make sure you specified the correct `repo_id` and `repo_type`.
If you are trying to access a private or gated repo, make sure you are authenticated.
Invalid username or password.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "C:\Users\CYBERN~1\AppData\Local\Temp\_MEI88162\gradio\queueing.py", line 495, in call_prediction
output = await route_utils.call_process_api(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\CYBERN~1\AppData\Local\Temp\_MEI88162\gradio\route_utils.py", line 235, in call_process_api
output = await app.get_blocks().process_api(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\CYBERN~1\AppData\Local\Temp\_MEI88162\gradio\blocks.py", line 1627, in process_api
result = await self.call_function(
^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\CYBERN~1\AppData\Local\Temp\_MEI88162\gradio\blocks.py", line 1185, in call_function
prediction = await utils.async_iteration(iterator)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\CYBERN~1\AppData\Local\Temp\_MEI88162\gradio\utils.py", line 514, in async_iteration
return await iterator.__anext__()
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\CYBERN~1\AppData\Local\Temp\_MEI88162\gradio\utils.py", line 507, in __anext__
return await anyio.to_thread.run_sync(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "anyio\to_thread.py", line 56, in run_sync
File "anyio\_backends\_asyncio.py", line 2144, in run_sync_in_worker_thread
File "anyio\_backends\_asyncio.py", line 851, in run
File "C:\Users\CYBERN~1\AppData\Local\Temp\_MEI88162\gradio\utils.py", line 490, in run_sync_iterator_async
return next(iterator)
^^^^^^^^^^^^^^
File "C:\Users\CYBERN~1\AppData\Local\Temp\_MEI88162\gradio\utils.py", line 673, in gen_wrapper
response = next(iterator)
^^^^^^^^^^^^^^
File "apps\shark_studio\api\sd.py", line 441, in shark_sd_fn_dict_input
File "apps\shark_studio\api\sd.py", line 553, in shark_sd_fn
File "apps\shark_studio\api\sd.py", line 86, in __init__
File "turbine_models\custom_models\sd_inference\unet.py", line 71, in __init__
File "diffusers\models\modeling_utils.py", line 712, in from_pretrained
File "diffusers\configuration_utils.py", line 385, in load_config
OSError: stabilityai/stable-diffusion-xl-1.0 is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models'
If this is a private repository, make sure to pass a token having permission to this repo with `use_auth_token` or log in with `huggingface-cli login`.
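For context, the 401 here is the Hub's standard response when a repository does not exist (it deliberately does not reveal whether a repo is missing or private). The id in the traceback, stabilityai/stable-diffusion-xl-1.0, is not a valid Hub repo; the public SDXL base weights live at stabilityai/stable-diffusion-xl-base-1.0, which I assume is the intended target. A minimal sketch of the resolve-URL layout involved, showing both ids (the helper below is illustrative, not part of Shark or huggingface_hub):

```python
# Sketch of the suspected mismatch. Assumption: the correct public repo id is
# "stabilityai/stable-diffusion-xl-base-1.0"; the failing id is taken from the log.

def hub_resolve_url(repo_id: str, filename: str, revision: str = "main") -> str:
    # Rebuilds the Hub "resolve" URL layout seen in the 401 error above.
    return f"https://huggingface.co/{repo_id}/resolve/{revision}/{filename}"

failing_id = "stabilityai/stable-diffusion-xl-1.0"        # id Shark requests
expected_id = "stabilityai/stable-diffusion-xl-base-1.0"  # assumed correct id

# The first URL matches the one in the traceback and returns 401 on the Hub;
# the second points at the existing public SDXL base repository.
print(hub_resolve_url(failing_id, "unet/config.json"))
print(hub_resolve_url(expected_id, "unet/config.json"))
```

Until the hardcoded id is corrected, pointing the pipeline at the -base- repo id (or at a local copy of the model) should avoid the 401.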