zaffry007 opened 2 months ago
I had the same error and fixed it by setting a default value for num_inference_steps:
```python
def update_slider(checkpoint, num_inference_steps=4):
    if checkpoint == "sayakpaul/FLUX.1-merged":
        return 8
    else:
        return 4
```
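For context: the `Expected 2 arguments for function <function update_slider ...>, received 1` warning in the log further down suggests Gradio invokes `update_slider` with only the checkpoint value, so the default fills in the missing argument. A minimal sketch of that wiring, assuming a `gr.Blocks` layout; the component names here are mine, not necessarily what app.py actually uses:

```python
import gradio as gr

def update_slider(checkpoint, num_inference_steps=4):
    # The default covers the case where Gradio passes only one input.
    if checkpoint == "sayakpaul/FLUX.1-merged":
        return 8
    return 4

with gr.Blocks() as demo:
    checkpoint = gr.Dropdown(
        choices=["black-forest-labs/FLUX.1-schnell", "sayakpaul/FLUX.1-merged"],
        value="black-forest-labs/FLUX.1-schnell",
        label="Checkpoint",
    )
    steps = gr.Slider(1, 50, value=4, step=1, label="num_inference_steps")
    # With a single input, only `checkpoint` is passed to the callback;
    # `num_inference_steps` falls back to its default instead of raising.
    checkpoint.change(fn=update_slider, inputs=checkpoint, outputs=steps)
```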
I have another strange problem: the app abruptly exits without any error. It closes when line 72 in the infer function runs:

```python
transformer = QuantizedFluxTransformer2DModel.from_pretrained("cocktailpeanut/flux1-schnell-q8")
```
I tried wrapping it in a try/except, but the app still crashes with no report.
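For what it's worth, a Python try/except cannot catch a crash that happens inside native code (for example a segfault or an out-of-memory kill while the quantized weights load). Enabling the standard-library faulthandler before the call sometimes surfaces a traceback; a minimal sketch, assuming `QuantizedFluxTransformer2DModel` is already defined as in app.py:

```python
import faulthandler

# Print a Python traceback if the process dies on a fatal signal
# (SIGSEGV, SIGABRT, etc.) instead of exiting silently.
faulthandler.enable()

# Same call as line 72 of app.py; a try/except around it will not
# catch a native-level crash, but faulthandler may report where it died.
transformer = QuantizedFluxTransformer2DModel.from_pretrained(
    "cocktailpeanut/flux1-schnell-q8"
)
```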
I am having this issue as well. It is also being reported as "Failure to Download".
Whenever I start generating, it just gets stuck with the GPU at 100% and makes no progress for nearly an hour. Eventually the web UI says Error.
```
e:\pinokio\api\flux-webui.git>conda_hook && conda deactivate && conda deactivate && conda deactivate && conda activate base && e:\pinokio\api\flux-webui.git\env\Scripts\activate e:\pinokio\api\flux-webui.git\env && python app.py
e:\pinokio\api\flux-webui.git\env\lib\site-packages\xformers\ops\fmha\flash.py:211: FutureWarning: `torch.library.impl_abstract` was renamed to `torch.library.register_fake`. Please use that instead; we will remove `torch.library.impl_abstract` in a future version of PyTorch.
  @torch.library.impl_abstract("xformers_flash::flash_fwd")
e:\pinokio\api\flux-webui.git\env\lib\site-packages\xformers\ops\fmha\flash.py:344: FutureWarning: `torch.library.impl_abstract` was renamed to `torch.library.register_fake`. Please use that instead; we will remove `torch.library.impl_abstract` in a future version of PyTorch.
  @torch.library.impl_abstract("xformers_flash::flash_bwd")
e:\pinokio\api\flux-webui.git\env\lib\site-packages\gradio\utils.py:1002: UserWarning: Expected 2 arguments for function <function update_slider at 0x0000020952E6D630>, received 1.
  warnings.warn(
e:\pinokio\api\flux-webui.git\env\lib\site-packages\gradio\utils.py:1006: UserWarning: Expected at least 2 arguments for function <function update_slider at 0x0000020952E6D630>, received 1.
  warnings.warn(
Running on local URL: http://127.0.0.1:7860
To create a public link, set `share=True` in `launch()`.
e:\pinokio\api\flux-webui.git\env\lib\site-packages\gradio\helpers.py:978: UserWarning: Unexpected argument. Filling with None.
  warnings.warn("Unexpected argument. Filling with None.")
initializing quantized transformer...
initialized!
moving device to cuda
initializing pipeline...
You set `add_prefix_space`. The tokenizer needs to be converted from the slow tokenizers
initialized!
enable model cpu offload...
done!
Started the inference. Wait...
e:\pinokio\api\flux-webui.git\env\lib\site-packages\diffusers\models\attention_processor.py:1848: UserWarning: 1Torch was not compiled with flash attention. (Triggered internally at C:\actions-runner\_work\pytorch\pytorch\builder\windows\pytorch\aten\src\ATen\native\transformers\cuda\sdp_utils.cpp:555.)
  hidden_states = F.scaled_dot_product_attention(query, key, value, dropout_p=0.0, is_causal=False)
```
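One way to tell whether a run like this is genuinely stuck rather than just slow is to log each denoising step, since diffusers pipelines accept a per-step callback. A sketch, assuming `pipe` is the FluxPipeline built in app.py; the variable and prompt here are mine:

```python
# Print each completed denoising step so a stalled run is
# distinguishable from a merely slow one.
def log_step(pipeline, step, timestep, callback_kwargs):
    print(f"finished step {step} (timestep {timestep})")
    return callback_kwargs  # the callback must return the kwargs dict

image = pipe(
    prompt="a photo of a cat",
    num_inference_steps=4,
    callback_on_step_end=log_step,
).images[0]
```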