pinokiofactory / flux-webui


Not Generating Image #10

zaffry007 opened this issue 2 months ago

zaffry007 commented 2 months ago

Whenever I start generating, it just gets stuck here with the GPU at 100%, and after nearly an hour there is no progress. Eventually the web UI says Error.

e:\pinokio\api\flux-webui.git>conda_hook && conda deactivate && conda deactivate && conda deactivate && conda activate base && e:\pinokio\api\flux-webui.git\env\Scripts\activate e:\pinokio\api\flux-webui.git\env && python app.py

e:\pinokio\api\flux-webui.git\env\lib\site-packages\xformers\ops\fmha\flash.py:211: FutureWarning: torch.library.impl_abstract was renamed to torch.library.register_fake. Please use that instead; we will remove torch.library.impl_abstract in a future version of PyTorch.
  @torch.library.impl_abstract("xformers_flash::flash_fwd")
e:\pinokio\api\flux-webui.git\env\lib\site-packages\xformers\ops\fmha\flash.py:344: FutureWarning: torch.library.impl_abstract was renamed to torch.library.register_fake. Please use that instead; we will remove torch.library.impl_abstract in a future version of PyTorch.
  @torch.library.impl_abstract("xformers_flash::flash_bwd")
e:\pinokio\api\flux-webui.git\env\lib\site-packages\gradio\utils.py:1002: UserWarning: Expected 2 arguments for function <function update_slider at 0x0000020952E6D630>, received 1.
  warnings.warn(
e:\pinokio\api\flux-webui.git\env\lib\site-packages\gradio\utils.py:1006: UserWarning: Expected at least 2 arguments for function <function update_slider at 0x0000020952E6D630>, received 1.
  warnings.warn(
Running on local URL: http://127.0.0.1:7860

To create a public link, set share=True in launch().

e:\pinokio\api\flux-webui.git\env\lib\site-packages\gradio\helpers.py:978: UserWarning: Unexpected argument. Filling with None.
  warnings.warn("Unexpected argument. Filling with None.")
initializing quantized transformer...

initialized!
moving device to cuda
initializing pipeline...
You set add_prefix_space. The tokenizer needs to be converted from the slow tokenizers
initialized!
enable model cpu offload...
done!
Started the inference. Wait...
e:\pinokio\api\flux-webui.git\env\lib\site-packages\diffusers\models\attention_processor.py:1848: UserWarning: 1Torch was not compiled with flash attention. (Triggered internally at C:\actions-runner_work\pytorch\pytorch\builder\windows\pytorch\aten\src\ATen\native\transformers\cuda\sdp_utils.cpp:555.)
  hidden_states = F.scaled_dot_product_attention(query, key, value, dropout_p=0.0, is_causal=False)

eddielthomas commented 2 months ago

I had the same error and fixed it by setting a default value for num_inference_steps:

def update_slider(checkpoint, num_inference_steps=4):
    if checkpoint == "sayakpaul/FLUX.1-merged":
        return 8
    else:
        return 4
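
For context, the gradio warnings in the log above ("Expected 2 arguments for function update_slider, received 1") suggest the event handler passes only the checkpoint value, so the second parameter needs a default. A minimal sketch of that wiring, assuming a Blocks layout (the component names here are hypothetical, not the app's actual ones):

import gradio as gr

def update_slider(checkpoint, num_inference_steps=4):
    # The default keeps the call valid when the event supplies
    # only the checkpoint value.
    if checkpoint == "sayakpaul/FLUX.1-merged":
        return 8
    return 4

with gr.Blocks() as demo:
    # Hypothetical components, for illustration only.
    checkpoint = gr.Dropdown(
        ["cocktailpeanut/flux1-schnell-q8", "sayakpaul/FLUX.1-merged"],
        label="Checkpoint",
    )
    steps = gr.Slider(1, 50, value=4, label="num_inference_steps")
    # Only one input is wired up, which is why the default matters.
    checkpoint.change(update_slider, inputs=[checkpoint], outputs=[steps])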

eddielthomas commented 2 months ago

I have another strange problem: the app abruptly exits without any error. The app closes when line 72 in the infer function runs:

transformer = QuantizedFluxTransformer2DModel.from_pretrained("cocktailpeanut/flux1-schnell-q8")

I tried wrapping it in a try/except, but the app still crashes with no report.
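
That would explain why try/except sees nothing: it only catches Python-level exceptions, not native faults (a segfault or abort in C++/CUDA code, or the OS killing the process on out-of-memory). A minimal sketch using the standard-library faulthandler module; placing it before the failing load is an assumption, not how app.py is actually structured:

import faulthandler

# Dump the Python traceback if the process dies from a native fault
# (SIGSEGV/SIGABRT). It cannot help if the OS kills the process outright.
faulthandler.enable()

# Then trigger the load that crashes (line 72 in app.py, per the comment
# above; QuantizedFluxTransformer2DModel is the class defined there):
transformer = QuantizedFluxTransformer2DModel.from_pretrained(
    "cocktailpeanut/flux1-schnell-q8"
)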

ahosch commented 2 months ago

> I have another strange problem: the app abruptly exits without any error. The app closes when line 72 in the infer function runs:
>
> transformer = QuantizedFluxTransformer2DModel.from_pretrained("cocktailpeanut/flux1-schnell-q8")
>
> I tried wrapping it in a try/except, but the app still crashes with no report.

I am having this issue as well. It is also being reported as "Failure to Download".
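
If a download failure is part of it, one way to separate downloading from loading is to fetch the weights ahead of time and only then start the app. A minimal sketch with huggingface_hub, assuming the same repo id the app loads:

from huggingface_hub import snapshot_download

# Pre-download (or resume) the repo into the local HF cache; network or
# disk problems surface here as an ordinary exception instead of killing
# the app mid-load.
path = snapshot_download("cocktailpeanut/flux1-schnell-q8")
print("weights cached at:", path)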