Open zeuszl1 opened 7 months ago
seems like https://huggingface.co/api/models/stablediffusionapi/juggernaut-xl-v8 is removed. @aitrepreneur what alternatives can we use?
You could try fixing this by using another SDXL model. For example, you could run app.py with the --model_path argument set to "RunDiffusion/Juggernaut-XL-v8".
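For reference, the suggested invocation would look something like this (a sketch assuming app.py exposes the --model_path flag mentioned above; any other flags your setup needs stay the same):

```shell
python app.py --model_path "RunDiffusion/Juggernaut-XL-v8"
```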
I've tried several different models, but it always crashes with an error on startup. I ran the code on Google Colab and changed the model by editing line 422 of app.py. If anyone has been able to solve the error, please drop me a comment. Below is the traceback:
File "/usr/local/lib/python3.10/dist-packages/gradio/queueing.py", line 495, in call_prediction
output = await route_utils.call_process_api(
File "/usr/local/lib/python3.10/dist-packages/gradio/route_utils.py", line 232, in call_process_api
output = await app.get_blocks().process_api(
File "/usr/local/lib/python3.10/dist-packages/gradio/blocks.py", line 1561, in process_api
result = await self.call_function(
File "/usr/local/lib/python3.10/dist-packages/gradio/blocks.py", line 1179, in call_function
prediction = await anyio.to_thread.run_sync(
File "/usr/local/lib/python3.10/dist-packages/anyio/to_thread.py", line 33, in run_sync
return await get_asynclib().run_sync_in_worker_thread(
File "/usr/local/lib/python3.10/dist-packages/anyio/_backends/_asyncio.py", line 877, in run_sync_in_worker_thread
return await future
File "/usr/local/lib/python3.10/dist-packages/anyio/_backends/_asyncio.py", line 807, in run
result = context.run(func, *args)
File "/usr/local/lib/python3.10/dist-packages/gradio/utils.py", line 678, in wrapper
response = f(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/gradio/utils.py", line 678, in wrapper
response = f(*args, **kwargs)
File "/content/INSTID/app.py", line 231, in generate_image
images = pipe(
File "/usr/local/lib/python3.10/dist-packages/torch/utils/_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
File "/content/INSTID/pipeline_stable_diffusion_xl_instantid.py", line 794, in __call__
self.check_inputs(
File "/usr/local/lib/python3.10/dist-packages/diffusers/pipelines/controlnet/pipeline_controlnet_sd_xl.py", line 684, in check_inputs
raise TypeError("For single controlnet: `controlnet_conditioning_scale` must be type `float`.")
TypeError: For single controlnet: `controlnet_conditioning_scale` must be type `float`.
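A minimal sketch of a workaround for the TypeError above (not the project's official fix): diffusers' check_inputs() rejects controlnet_conditioning_scale when a single ControlNet is loaded but the value arrives as an int or a one-element list (e.g. from a Gradio slider), so coerce it to a plain float before calling pipe(...). The helper name coerce_scale is hypothetical.

```python
def coerce_scale(scale):
    """Return controlnet_conditioning_scale as a plain float.

    Unwraps a one-element list (the multi-controlnet form) and casts
    ints to float, which is what check_inputs() expects for a single
    ControlNet.
    """
    if isinstance(scale, list) and len(scale) == 1:
        scale = scale[0]
    return float(scale)

print(coerce_scale(1))      # 1.0
print(coerce_scale([0.8]))  # 0.8
```

In app.py this would be applied to the value just before it is passed into pipe(...), e.g. controlnet_conditioning_scale=coerce_scale(controlnet_conditioning_scale).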
Getting this error after using the auto-install .bat and also when trying to install manually; the log shows that the repo is not found. Any help would be much appreciated.
Code:
A matching Triton is not available, some optimizations will not be enabled
Traceback (most recent call last):
File "D:\AI\InstantID-Controlnet\env\lib\site-packages\xformers\__init__.py", line 55, in _is_triton_available
from xformers.triton.softmax import softmax as triton_softmax  # noqa
File "D:\AI\InstantID-Controlnet\env\lib\site-packages\xformers\triton\softmax.py", line 11, in <module>
import triton
ModuleNotFoundError: No module named 'triton'
D:\AI\InstantID-Controlnet\env\lib\site-packages\diffusers\utils\outputs.py:63: UserWarning: torch.utils._pytree._register_pytree_node is deprecated. Please use torch.utils._pytree.register_pytree_node instead.
torch.utils._pytree._register_pytree_node(
D:\AI\InstantID-Controlnet\env\lib\site-packages\diffusers\utils\outputs.py:63: UserWarning: torch.utils._pytree._register_pytree_node is deprecated. Please use torch.utils._pytree.register_pytree_node instead.
torch.utils._pytree._register_pytree_node(
D:\AI\InstantID-Controlnet\env\lib\site-packages\controlnet_aux\mediapipe_face\mediapipe_face_common.py:7: UserWarning: The module 'mediapipe' is not installed. The package will have limited functionality. Please install it using the command: pip install 'mediapipe'
warnings.warn(
D:\AI\InstantID-Controlnet\env\lib\site-packages\controlnet_aux\segment_anything\modeling\tiny_vit_sam.py:654: UserWarning: Overwriting tiny_vit_5m_224 in registry with controlnet_aux.segment_anything.modeling.tiny_vit_sam.tiny_vit_5m_224. This is because the name being registered conflicts with an existing name. Please check if this is not expected.
return register_model(fn_wrapper)
D:\AI\InstantID-Controlnet\env\lib\site-packages\controlnet_aux\segment_anything\modeling\tiny_vit_sam.py:654: UserWarning: Overwriting tiny_vit_11m_224 in registry with controlnet_aux.segment_anything.modeling.tiny_vit_sam.tiny_vit_11m_224. This is because the name being registered conflicts with an existing name. Please check if this is not expected.
return register_model(fn_wrapper)
D:\AI\InstantID-Controlnet\env\lib\site-packages\controlnet_aux\segment_anything\modeling\tiny_vit_sam.py:654: UserWarning: Overwriting tiny_vit_21m_224 in registry with controlnet_aux.segment_anything.modeling.tiny_vit_sam.tiny_vit_21m_224. This is because the name being registered conflicts with an existing name. Please check if this is not expected.
return register_model(fn_wrapper)
D:\AI\InstantID-Controlnet\env\lib\site-packages\controlnet_aux\segment_anything\modeling\tiny_vit_sam.py:654: UserWarning: Overwriting tiny_vit_21m_384 in registry with controlnet_aux.segment_anything.modeling.tiny_vit_sam.tiny_vit_21m_384. This is because the name being registered conflicts with an existing name. Please check if this is not expected.
return register_model(fn_wrapper)
D:\AI\InstantID-Controlnet\env\lib\site-packages\controlnet_aux\segment_anything\modeling\tiny_vit_sam.py:654: UserWarning: Overwriting tiny_vit_21m_512 in registry with controlnet_aux.segment_anything.modeling.tiny_vit_sam.tiny_vit_21m_512. This is because the name being registered conflicts with an existing name. Please check if this is not expected.
return register_model(fn_wrapper)
Applied providers: ['CPUExecutionProvider'], with options: {'CPUExecutionProvider': {}}
find model: ./models\antelopev2\1k3d68.onnx landmark_3d_68 ['None', 3, 192, 192] 0.0 1.0
Applied providers: ['CPUExecutionProvider'], with options: {'CPUExecutionProvider': {}}
find model: ./models\antelopev2\2d106det.onnx landmark_2d_106 ['None', 3, 192, 192] 0.0 1.0
Applied providers: ['CPUExecutionProvider'], with options: {'CPUExecutionProvider': {}}
find model: ./models\antelopev2\genderage.onnx genderage ['None', 3, 96, 96] 0.0 1.0
Applied providers: ['CPUExecutionProvider'], with options: {'CPUExecutionProvider': {}}
find model: ./models\antelopev2\glintr100.onnx recognition ['None', 3, 112, 112] 127.5 127.5
Applied providers: ['CPUExecutionProvider'], with options: {'CPUExecutionProvider': {}}
find model: ./models\antelopev2\scrfd_10g_bnkps.onnx detection [1, 3, '?', '?'] 127.5 128.0
set det-size: (640, 640)
Couldn't connect to the Hub: 401 Client Error. (Request ID: Root=1-65fa04e9-6393cec82e2f26366c39bcde;6671a56a-cd16-4ecb-81fc-c3995c4569a4)
Repository Not Found for url: https://huggingface.co/api/models/stablediffusionapi/juggernaut-xl-v8. Please make sure you specified the correct `repo_id` and `repo_type`. If you are trying to access a private or gated repo, make sure you are authenticated. Invalid username or password. Will try to load from local cache.
Traceback (most recent call last):
File "D:\AI\InstantID-Controlnet\env\lib\site-packages\huggingface_hub\utils\_errors.py", line 286, in hf_raise_for_status
response.raise_for_status()
File "D:\AI\InstantID-Controlnet\env\lib\site-packages\requests\models.py", line 1021, in raise_for_status
raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/api/models/stablediffusionapi/juggernaut-xl-v8

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "D:\AI\InstantID-Controlnet\env\lib\site-packages\diffusers\pipelines\pipeline_utils.py", line 1656, in download
info = model_info(pretrained_model_name, token=token, revision=revision)
File "D:\AI\InstantID-Controlnet\env\lib\site-packages\huggingface_hub\utils\_validators.py", line 118, in _inner_fn
return fn(*args, **kwargs)
File "D:\AI\InstantID-Controlnet\env\lib\site-packages\huggingface_hub\hf_api.py", line 2085, in model_info
hf_raise_for_status(r)
File "D:\AI\InstantID-Controlnet\env\lib\site-packages\huggingface_hub\utils\_errors.py", line 323, in hf_raise_for_status
raise RepositoryNotFoundError(message, response) from e
huggingface_hub.utils._errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-65fa04e9-6393cec82e2f26366c39bcde;6671a56a-cd16-4ecb-81fc-c3995c4569a4)
Repository Not Found for url: https://huggingface.co/api/models/stablediffusionapi/juggernaut-xl-v8. Please make sure you specified the correct `repo_id` and `repo_type`. If you are trying to access a private or gated repo, make sure you are authenticated. Invalid username or password.

The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "D:\AI\InstantID-Controlnet\app.py", line 158, in <module>
pipe = StableDiffusionXLInstantIDPipeline.from_pretrained(
File "D:\AI\InstantID-Controlnet\env\lib\site-packages\huggingface_hub\utils\_validators.py", line 118, in _inner_fn
return fn(*args, **kwargs)
File "D:\AI\InstantID-Controlnet\env\lib\site-packages\diffusers\pipelines\pipeline_utils.py", line 1096, in from_pretrained
cached_folder = cls.download(
File "D:\AI\InstantID-Controlnet\env\lib\site-packages\huggingface_hub\utils\_validators.py", line 118, in _inner_fn
return fn(*args, **kwargs)
File "D:\AI\InstantID-Controlnet\env\lib\site-packages\diffusers\pipelines\pipeline_utils.py", line 1905, in download
raise EnvironmentError(
OSError: Cannot load model stablediffusionapi/juggernaut-xl-v8: model is not cached locally and an error occured while trying to fetch metadata from the Hub. Please check out the root cause in the stacktrace above.
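Since the stablediffusionapi/juggernaut-xl-v8 repo no longer resolves on the Hub, one possible fix (a fragment, not a tested drop-in) is to point from_pretrained at a mirror of the same checkpoint, such as the RunDiffusion/Juggernaut-XL-v8 repo suggested earlier in this thread. The pipeline class name comes from the traceback above; the controlnet keyword and any other arguments your app.py already passes are assumptions and should be kept as-is:

```python
# Fragment: swap the removed Hub repo id in app.py's from_pretrained call.
# `controlnet` here stands for the ControlNet object app.py already builds.
pipe = StableDiffusionXLInstantIDPipeline.from_pretrained(
    "RunDiffusion/Juggernaut-XL-v8",  # replaces "stablediffusionapi/juggernaut-xl-v8"
    controlnet=controlnet,
)
```

Note that this triggers a fresh download of the model weights on first run, so the 401/RepositoryNotFound errors should disappear only if the replacement repo id actually exists and is public.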