lllyasviel / stable-diffusion-webui-forge


lllyasviel/flux1-dev-bnb-nf4 by Python #2118

Open Kovhel1 opened 2 days ago

Kovhel1 commented 2 days ago

I'm trying to load the model lllyasviel/flux1-dev-bnb-nf4 from https://huggingface.co/lllyasviel/flux1-dev-bnb-nf4 together with all of its components:

VAE: https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/ae.safetensors
T5 encoder: https://huggingface.co/comfyanonymous/flux_text_encoders/blob/main/t5xxl_fp8_e4m3fn.safetensors
CLIP: https://huggingface.co/comfyanonymous/flux_text_encoders/blob/main/clip_l.safetensors
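(For reference, a minimal sketch of fetching those files with the huggingface_hub package; the flux/ target folder is just an assumption, and black-forest-labs/FLUX.1-dev is a gated repo, so an accepted license and a Hugging Face token may be needed:)

# Sketch: download the checkpoints listed above with huggingface_hub
# (pip install huggingface_hub; FLUX.1-dev is gated, so log in or pass token=...)
from huggingface_hub import hf_hub_download

hf_hub_download("lllyasviel/flux1-dev-bnb-nf4", "flux1-dev-bnb-nf4.safetensors", local_dir="flux")
hf_hub_download("black-forest-labs/FLUX.1-dev", "ae.safetensors", local_dir="flux")
hf_hub_download("comfyanonymous/flux_text_encoders", "t5xxl_fp8_e4m3fn.safetensors", local_dir="flux")
hf_hub_download("comfyanonymous/flux_text_encoders", "clip_l.safetensors", local_dir="flux")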

My repo now:

flux/
├── ae.safetensors
├── flux1-dev-bnb-nf4.safetensors
├── model_index.json   //made by me
├── tokenizer/
│   └── tokenizer.json   //from https://huggingface.co/black-forest-labs/FLUX.1-dev/tree/main/tokenizer_2
├── t5xxl_fp8/
│   ├── config.json       //from https://huggingface.co/black-forest-labs/FLUX.1-dev/tree/main/transformer
│   └── model.safetensors
├── vae/                       
│   ├── config.json       //from here https://huggingface.co/black-forest-labs/FLUX.1-dev/tree/main/vae
│   └── diffusion_pytorch_model.safetensors
└── clip_l/
    ├── config.json
    └── model.safetensors

My code:

import os
import torch
from diffusers import AutoencoderKL, FluxTransformer2DModel, FluxPipeline
from transformers import CLIPTextModel, PreTrainedTokenizerFast, T5ForConditionalGeneration, logging

#print(os.path.exists(diffusion_model_path))  # should print True if the file exists
#logging.set_verbosity(logging.DEBUG)
torch.cuda.empty_cache()
torch.device('cuda')  # note: this only creates a device object; it does not move anything to the GPU by itself

#Components
print('vae')
vae = AutoencoderKL.from_pretrained("./flux/vae")

print('clip_l')
text_encoder = CLIPTextModel.from_pretrained("./flux/clip_l")

print('tokenizer')
tokenizer = PreTrainedTokenizerFast.from_pretrained("./flux/tokenizer")

print('t5xxl_fp8')
t5_model = T5ForConditionalGeneration.from_pretrained("./flux/t5xxl_fp8")

print('flux1-dev')
transformer = FluxTransformer2DModel.from_pretrained("./flux/flux1-dev-bnb-nf4.safetensors")

#Model
print('Create the pipeline with the loaded models...')
model = FluxPipeline(vae=vae, text_encoder=text_encoder, tokenizer=tokenizer, transformer=transformer, t5_model=t5_model)
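
As a side note on the code above (based on the diffusers API as I understand it, so please check against your installed version): FluxTransformer2DModel.from_pretrained expects a folder with a config.json and weight files, not a path to a single .safetensors file (single checkpoint files go through from_single_file), and FluxPipeline has no t5_model argument; the T5 encoder is passed as text_encoder_2 (a T5EncoderModel) together with tokenizer_2 and a scheduler. A rough sketch of that wiring, with folder names taken from the layout above:

# Rough sketch of the conventional FluxPipeline wiring (assumptions: recent diffusers,
# tokenizer files present in the folders below; the Forge bnb-nf4 single-file checkpoint
# may still not load this way, depending on the diffusers version).
import torch
from diffusers import AutoencoderKL, FluxPipeline, FluxTransformer2DModel, FlowMatchEulerDiscreteScheduler
from transformers import CLIPTextModel, CLIPTokenizer, T5EncoderModel, T5TokenizerFast

vae = AutoencoderKL.from_pretrained("./flux/vae", torch_dtype=torch.bfloat16)
text_encoder = CLIPTextModel.from_pretrained("./flux/clip_l", torch_dtype=torch.bfloat16)
tokenizer = CLIPTokenizer.from_pretrained("./flux/clip_l")            # assumes CLIP tokenizer files are here
text_encoder_2 = T5EncoderModel.from_pretrained("./flux/t5xxl_fp8")   # encoder-only T5, not T5ForConditionalGeneration
tokenizer_2 = T5TokenizerFast.from_pretrained("./flux/tokenizer")
scheduler = FlowMatchEulerDiscreteScheduler()                         # default config as a placeholder

# Single .safetensors files are loaded with from_single_file, not from_pretrained.
transformer = FluxTransformer2DModel.from_single_file(
    "./flux/flux1-dev-bnb-nf4.safetensors", torch_dtype=torch.bfloat16
)

pipe = FluxPipeline(
    scheduler=scheduler,
    vae=vae,
    text_encoder=text_encoder,
    tokenizer=tokenizer,
    text_encoder_2=text_encoder_2,
    tokenizer_2=tokenizer_2,
    transformer=transformer,
)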

The error:

---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
Cell In[11], line 22
     19 tokenizer = PreTrainedTokenizerFast.from_pretrained("./flux/tokenizer")
     21 print('t5xxl_fp8')
---> 22 t5_model = T5ForConditionalGeneration.from_pretrained("./flux/t5xxl_fp8")
     24 print('flux1-dev')
     25 transformer = FluxTransformer2DModel.from_pretrained("./flux/flux1-dev-bnb-nf4.safetensors")

File ~/.local/lib/python3.12/site-packages/transformers/modeling_utils.py:3792, in PreTrainedModel.from_pretrained(cls, pretrained_model_name_or_path, config, cache_dir, ignore_mismatched_sizes, force_download, local_files_only, token, revision, use_safetensors, *model_args, **kwargs)
   3789 with safe_open(resolved_archive_file, framework="pt") as f:
   3790     metadata = f.metadata()
-> 3792 if metadata.get("format") == "pt":
   3793     pass
   3794 elif metadata.get("format") == "tf":

AttributeError: 'NoneType' object has no attribute 'get'
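
A note on what the traceback is saying: f.metadata() returned None, which means the safetensors file being loaded from ./flux/t5xxl_fp8 has no header metadata at all, while transformers expects a "format": "pt" entry there. A minimal sketch to check this (with a commonly suggested workaround left commented out):

# Sketch: inspect the safetensors header metadata of the file that fails to load.
from safetensors import safe_open

path = "./flux/t5xxl_fp8/model.safetensors"
with safe_open(path, framework="pt") as f:
    print(f.metadata())          # None here reproduces the AttributeError above
    print(list(f.keys())[:5])    # a few tensor names, to sanity-check the file itself

# A commonly suggested workaround is to re-save the weights with the expected metadata:
# from safetensors.torch import load_file, save_file
# save_file(load_file(path), path, metadata={"format": "pt"})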

Does anybody know what my mistake is? P.S. I'm really bad with config files.

Thanks!!!!

1272870698 commented 2 days ago

Me too. If you solve it, could you please share the solution? Thanks a lot!

Kovhel1 commented 2 days ago

@1272870698 The problem isn't solved yet, but I found something interesting. Take a look here: https://huggingface.co/HighCWu/FLUX.1-dev-4bit

1272870698 commented 2 days ago

> @1272870698 The problem isn't solved yet, but I found something interesting. Take a look here: https://huggingface.co/HighCWu/FLUX.1-dev-4bit

Cool!! I'll try this. You're amazing. How did you find it?

Kovhel1 commented 1 day ago

@1272870698 I was just browsing around on Hugging Face. I'll try the encoder_2 from that repo, just for fun, on a better machine. But as far as I can see right now, the code doesn't work with that one either.

Kovhel1 commented 1 day ago

@1272870698 Pay attention to GGUF models too. I can't get them working right now, but they are also a good option for low GPU resources.
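
(For anyone reading along: newer diffusers releases document GGUF loading for the FLUX transformer roughly like the sketch below. It needs a recent diffusers plus the gguf package, and the quantized repo/filename here is only an example, so treat it as an untested sketch.)

# Sketch: load a GGUF-quantized FLUX transformer with a recent diffusers + gguf install.
import torch
from diffusers import FluxPipeline, FluxTransformer2DModel, GGUFQuantizationConfig

ckpt = "https://huggingface.co/city96/FLUX.1-dev-gguf/blob/main/flux1-dev-Q4_K_S.gguf"  # example checkpoint
transformer = FluxTransformer2DModel.from_single_file(
    ckpt,
    quantization_config=GGUFQuantizationConfig(compute_dtype=torch.bfloat16),
    torch_dtype=torch.bfloat16,
)
pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev",   # gated repo; supplies VAE, text encoders and scheduler
    transformer=transformer,
    torch_dtype=torch.bfloat16,
)
pipe.enable_model_cpu_offload()       # helps on low-VRAM GPUs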

Kovhel1 commented 1 day ago

@1272870698 I know there are a lot of models that work and are very easy to use, but their quality is not as good as FLUX's, so I decided to use FLUX modifications. Please tell me your opinion about the repo HighCWu/FLUX.1-dev-4bit (https://huggingface.co/HighCWu/FLUX.1-dev-4bit). What do you think about its quality?