oftenliu opened this issue 10 months ago
Try local_files_only = False
That doesn't help; it fails even when set to True. @Paper99
Hi @oftenliu, can you paste the full error information?
Here is the information:

```
python3 style_demo.py
  File "style_demo.py", line 44
    weight_name=os.path.basename(photomaker_path),
    ^
SyntaxError: invalid syntax

root@080d42d4fb48:/data/lqc/makephoto# python3 style_demo.py
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py", line 1701, in download_from_original_stable_diffusion_ckpt
    tokenizer = CLIPTokenizer.from_pretrained(
  File "/usr/local/lib/python3.8/dist-packages/transformers/tokenization_utils_base.py", line 2012, in from_pretrained
    raise EnvironmentError(
OSError: Can't load tokenizer for 'openai/clip-vit-large-patch14'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'openai/clip-vit-large-patch14' is the correct path to a directory containing all relevant files for a CLIPTokenizer tokenizer.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "style_demo.py", line 34, in
```
I have encountered the same problem. Have you resolved it?
Hey, I am facing the same problem. Did someone solve this?
I am downloading clip-vit-large-patch14 now; I don't know whether this will solve the problem. If you have found a solution, please let me know. Thank you.
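To verify that the download actually landed in the local Hugging Face cache, a quick sketch (the `models--org--name` directory layout is the hub's standard cache convention; `cached_model_dir` is a hypothetical helper, and the cache root may differ if `HF_HOME` is set):

```python
from pathlib import Path
from typing import Optional

def cached_model_dir(repo_id: str,
                     cache_root: str = "~/.cache/huggingface/hub") -> Optional[Path]:
    """Return the cache directory for repo_id if present locally, else None."""
    # The hub stores repos as e.g. models--openai--clip-vit-large-patch14
    candidate = Path(cache_root).expanduser() / ("models--" + repo_id.replace("/", "--"))
    return candidate if candidate.is_dir() else None

print(cached_model_dir("openai/clip-vit-large-patch14"))
```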
Yeah, I got it running now. It turns out some of the libraries were incompatible. I re-installed everything with the requirements.txt file, then a few libraries manually, and it worked. I didn't need to download clip-vit-large-patch14 or anything else.
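Before re-installing blindly, it can help to list the installed versions and compare them against the repo's requirements.txt; a small sketch (the package list is an assumption about which libraries are most likely to clash here):

```python
import importlib.metadata as md

# Collect the installed versions of the libraries most likely to conflict;
# mismatches against requirements.txt usually explain the failure.
versions = {}
for pkg in ("torch", "diffusers", "transformers", "huggingface_hub"):
    try:
        versions[pkg] = md.version(pkg)
    except md.PackageNotFoundError:
        versions[pkg] = None  # not installed
print(versions)
```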
Can you connect to Hugging Face?
The error is as follows:

```
ValueError: With local_files_only set to None, you must first locally save the tokenizer in the following path: 'openai/clip-vit-large-patch14'.
```
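One thing worth checking before the script runs: transformers also honors the `TRANSFORMERS_OFFLINE` environment variable, which forces `local_files_only` behavior for every `from_pretrained` call. A minimal sketch of making sure it is not enabled (this is a general transformers mechanism, not PhotoMaker-specific):

```python
import os

# TRANSFORMERS_OFFLINE=1 makes every from_pretrained() behave as if
# local_files_only=True, so clear it before importing transformers.
os.environ["TRANSFORMERS_OFFLINE"] = "0"
print(os.environ["TRANSFORMERS_OFFLINE"])
```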
I am trying to run the demo on localhost.
@Paper99, can you help?