black-forest-labs / flux

Official inference repo for FLUX.1 models
Apache License 2.0
14.43k stars 1.04k forks

I can't get it to work #145

Open equaerdist opened 3 weeks ago

equaerdist commented 3 weeks ago

I followed the instructions exactly:

```shell
git clone https://github.com/black-forest-labs/flux
python -m venv .venv
source .venv/Scripts/activate
pip install -e ".[all]"
```

Then I ran the model with `python -m flux --name "flux-schnell" --loop`. On the first run it downloaded a ~45 GB bin file; once everything was loaded, and on every run after that, I only got a warning message:

```
You are using the default legacy behaviour of the <class 'transformers.models.t5.tokenization_t5.T5Tokenizer'>. This is expected, and simply means that the `legacy` (previous) behavior will be used so nothing changes for you. If you want to use the new behaviour, set `legacy=False`. This should only be set if you understand what it means, and thoroughly read the reason why this was added as explained in https://github.com/huggingface/transformers/pull/24565
D:\Program Files\python\Lib\site-packages\transformers\tokenization_utils_base.py:1601: FutureWarning: `clean_up_tokenization_spaces` was not set. It will be set to `True` by default. This behavior will be depracted in transformers v4.45, and will be then set to `False` by default. For more details check this issue: https://github.com/huggingface/transformers/issues/31884
  warnings.warn(
```

After that the script simply closed. I assume this is a problem in the source files, since I did not change anything by hand.

My PC configuration: AMD Ryzen 5 5600, 16 GB DDR4, RX 580, Windows 10. I ran everything in PowerShell. Thanks for your attention.
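For what it's worth, the numbers reported here line up with a memory problem rather than with the tokenizer warning (which is harmless). A rough back-of-envelope sketch, using parameter counts assumed from the public model cards (~12B for the FLUX.1 transformer, ~4.7B for the T5-XXL text encoder):

```python
# Back-of-envelope weight-size estimate. The parameter counts below are
# assumptions taken from public model cards, not values from this repo.
BYTES_PER_PARAM = {"fp32": 4, "bf16": 2}

def weight_gb(n_params: float, dtype: str) -> float:
    """Raw weight size in GiB for a given parameter count and dtype."""
    return n_params * BYTES_PER_PARAM[dtype] / 1024**3

print(round(weight_gb(12e9, "fp32"), 1))   # ~44.7 GiB: matches the ~45 GB download
print(round(weight_gb(4.7e9, "bf16"), 1))  # ~8.8 GiB just for the text encoder
```

If that estimate is in the right ballpark, the fp32 checkpoint plus the T5 encoder cannot fit in 16 GB of system RAM, which would explain a silent exit (the OS killing the process) right after the warning.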

MartinAbilev commented 2 weeks ago

Something similar for me: neither cuda nor cpu works.

```
$ python demo_gr.py --name flux-schnell --device cuda
You are using the default legacy behaviour of the <class 'transformers.models.t5.tokenization_t5.T5Tokenizer'>. This is expected, and simply means that the `legacy` (previous) behavior will be used so nothing changes for you. If you want to use the new behaviour, set `legacy=False`. This should only be set if you understand what it means, and thoroughly read the reason why this was added as explained in https://github.com/huggingface/transformers/pull/24565
H:\flux\.venv\Lib\site-packages\transformers\tokenization_utils_base.py:1601: FutureWarning: `clean_up_tokenization_spaces` was not set. It will be set to `True` by default. This behavior will be depracted in transformers v4.45, and will be then set to `False` by default. For more details check this issue: https://github.com/huggingface/transformers/issues/31884
  warnings.warn(
Segmentation fault
(.venv)
```
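One thing worth noting for the `--device cuda` case: the RX 580 in the original report is an AMD card, so a CUDA build of PyTorch cannot see it at all; `torch.cuda.is_available()` is the usual check. A tiny, hypothetical guard (not part of this repo) would at least make the fallback explicit instead of crashing:

```python
def pick_device(requested: str, cuda_available: bool) -> str:
    # Hypothetical helper: fall back to CPU when CUDA was requested
    # but no CUDA-capable GPU is visible (e.g. an AMD RX 580 on Windows).
    if requested == "cuda" and not cuda_available:
        return "cpu"
    return requested
```

In a real script this would be called as `pick_device(args.device, torch.cuda.is_available())` before any model is loaded. It would not fix the CPU segfault, but it removes one variable from the report.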

MartinAbilev commented 2 weeks ago

The code stops when loading the T5 encoder:

```python
t5 = load_t5(device, max_length=256 if is_schnell else 512)
```
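For reference, the `256 if is_schnell else 512` in that line picks the T5 tokenizer's maximum prompt length per model variant. A minimal sketch of the same selection logic (the helper name is mine, not the repo's):

```python
def t5_max_length(model_name: str) -> int:
    # flux-schnell caps prompts at 256 T5 tokens; other variants
    # (e.g. flux-dev) use 512, mirroring `256 if is_schnell else 512`.
    is_schnell = model_name == "flux-schnell"
    return 256 if is_schnell else 512
```

The crash itself happens during the weight load, not the length selection, so reducing `max_length` alone is unlikely to help on a 16 GB machine.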