black-forest-labs / flux

Official inference repo for FLUX.1 models
Apache License 2.0
13.61k stars · 962 forks

Exception: data did not match any variant of untagged enum PyPreTokenizerTypeWrapper at line 960 column 3 #57

Open AhnLee opened 1 month ago

AhnLee commented 1 month ago

Hi, thanks for the awesome work.

I want to run FLUX via the diffusers code, but I get a tokenizer error like this:

---------------------------------------------------------------------------
Exception                                 Traceback (most recent call last)
Input In [3], in <cell line: 7>()
      3 from diffusers import FluxPipeline
      5 model_id = "black-forest-labs/FLUX.1-schnell"  # you can also use `black-forest-labs/FLUX.1-dev`
----> 7 pipe = FluxPipeline.from_pretrained("black-forest-labs/FLUX.1-schnell", torch_dtype=torch.bfloat16)
      8 pipe.enable_model_cpu_offload()  # save some VRAM by offloading the model to CPU. Remove this if you have enough GPU power
     10 prompt = "A cat holding a sign that says hello world"

File ~/anaconda3/envs/vdds/lib/python3.9/site-packages/huggingface_hub/utils/_validators.py:114, in validate_hf_hub_args.<locals>._inner_fn(*args, **kwargs)
    111 if check_use_auth_token:
    112     kwargs = smoothly_deprecate_use_auth_token(fn_name=fn.__name__, has_token=has_token, kwargs=kwargs)
--> 114 return fn(*args, **kwargs)

File ~/anaconda3/envs/vdds/lib/python3.9/site-packages/diffusers/pipelines/pipeline_utils.py:876, in DiffusionPipeline.from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
    873     loaded_sub_model = passed_class_obj[name]
    874 else:
    875     # load sub model
--> 876     loaded_sub_model = load_sub_model(
    877         library_name=library_name,
    878         class_name=class_name,
    879         importable_classes=importable_classes,
    880         pipelines=pipelines,
    881         is_pipeline_module=is_pipeline_module,
...
File ~/anaconda3/envs/vdds/lib/python3.9/site-packages/transformers/tokenization_utils_fast.py:114
    112 elif slow_tokenizer is not None:
    113     # We need to convert a slow tokenizer to build the backend
--> 114     fast_tokenizer = convert_slow_tokenizer(slow_tokenizer)

Exception: data did not match any variant of untagged enum PyPreTokenizerTypeWrapper at line 960 column 3

My tokenizers version is 0.11.1, and the diffusers FLUX code fails to run with it. I also tried tokenizers 0.13.3, but that failed as well.

How can I fix this error?
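Since the error comes from the installed package versions, it can help to list exactly what is in the environment before trying fixes. A small diagnostic sketch (not from the thread, just a convenience using the standard library):

```python
import importlib.metadata as md

# Print the installed versions of the packages involved in the traceback.
# Packages that are missing are reported instead of raising.
for pkg in ("diffusers", "transformers", "tokenizers", "accelerate"):
    try:
        print(pkg, md.version(pkg))
    except md.PackageNotFoundError:
        print(pkg, "not installed")
```

Including this output in a bug report makes version mismatches like the one below much easier to spot.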

Lavrikov commented 1 month ago

Faced the same problem on the first run. Try:

pip install --upgrade accelerate
pip install --upgrade transformers

The following versions work for me:

transformers-4.44.0
safetensors-0.4.4
tokenizers-0.19.1
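To check programmatically whether an installed tokenizers is older than the version reported to work above (0.19.1, the one pulled in by transformers 4.44.0), a minimal sketch; `needs_upgrade` and `parse_version` are hypothetical helpers, not part of any library:

```python
def parse_version(v: str) -> tuple:
    # Split a version string like "0.19.1" into (0, 19, 1) for tuple comparison.
    return tuple(int(part) for part in v.split(".")[:3])

def needs_upgrade(installed: str, minimum: str = "0.19.1") -> bool:
    # True if the installed version predates the known-good one.
    return parse_version(installed) < parse_version(minimum)

print(needs_upgrade("0.11.1"))  # the reporter's version -> True
print(needs_upgrade("0.13.3"))  # the other version tried -> True
print(needs_upgrade("0.19.1"))  # the working version -> False
```

Both versions the reporter tried predate 0.19.1, which is consistent with the upgrade fixing the error.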