huggingface / diffusers

🤗 Diffusers: State-of-the-art diffusion models for image and audio generation in PyTorch and FLAX.
https://huggingface.co/docs/diffusers
Apache License 2.0

Auth token is not passable to load_textual_inversion due to named argument conflicting with accepted kwargs #7689

Open Teriks opened 7 months ago

Teriks commented 7 months ago

Describe the bug

You cannot pass a Hugging Face auth token through the **kwargs of load_textual_inversion in src/diffusers/loaders/textual_inversion.py, because the name token is already taken by a named argument that serves a different purpose.

This appears to be the result of a refactor at some point: use_auth_token was renamed to token elsewhere in the library.

Since the named argument token is already claimed by the textual inversion prompt token, there is no way to supply an authentication token to this function.

The docstring contains two entries for this argument:

token (`str` or *bool*, *optional*):
    The token to use as HTTP bearer authorization for remote files. If `True`, the token generated from
    `diffusers-cli login` (stored in `~/.huggingface`) is used.

token (`str` or `List[str]`, *optional*):
    Override the token to use for the textual inversion weights. If `pretrained_model_name_or_path` is a
    list, then `token` must also be a list of equal length.
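
For clarity, here is a minimal sketch of why the collision happens. The simplified signature below is only an illustration of the name clash, not the actual diffusers code:

# A simplified stand-in for load_textual_inversion, only to illustrate the
# name collision; this is NOT the actual diffusers signature.
def load_textual_inversion(pretrained_model_name_or_path, token=None, **kwargs):
    # `token` is interpreted as the textual inversion prompt token, so an
    # auth token passed as `token=` never reaches **kwargs (where hub
    # download options such as the bearer token would normally be collected).
    print("prompt token:", token)
    print("remaining kwargs:", kwargs)

# The auth token is swallowed by the prompt-token parameter:
load_textual_inversion("sd-concepts-library/cat-toy", token="hf_xxx")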

Reproduction

N/A

Logs

No response

System Info

diffusers 0.27.2

Who can help?

No response

yiyixuxu commented 7 months ago

Can you provide a reproducible script?

Teriks commented 6 months ago

See also this line https://github.com/huggingface/diffusers/blob/cf6e0407e051467b480830d3ed97d2873b5019d3/src/diffusers/loaders/textual_inversion.py#L44

Here is code that illustrates the problem:

from diffusers import StableDiffusionPipeline
import torch

model_id = "runwayml/stable-diffusion-v1-5"
pipe = StableDiffusionPipeline.from_pretrained(model_id, torch_dtype=torch.float16).to("cuda")

# How can you pass a user auth token to load_textual_inversion for private repositories?

# once upon a time, **kwargs accepted the argument "use_auth_token",
# which was refactored to "token" within load_textual_inversion_state_dicts
# and elsewhere within the library

# see: diffusers/loaders/textual_inversion.py, line 43

# now an auth token cannot be passed; only the "token" that specifies
# a textual inversion trigger word in the prompt can be passed

# it is not possible to manually specify credentials, however the environment
# variable HF_TOKEN is *probably* still respected

# the documentation of this function erroneously contains two entries
# for the parameter "token": one for specifying the textual inversion
# prompt token, and one for specifying an auth token

# the latter is not possible, as both arguments have the same name, and the
# prompt token takes priority due to being a named parameter

pipe.load_textual_inversion("sd-concepts-library/cat-toy",
                            token="not-an-auth-token")

prompt = "A <not-an-auth-token> backpack"

image = pipe(prompt, num_inference_steps=50).images[0]
image.save("cat-backpack.png")

Potential workaround:

import os

from diffusers import StableDiffusionPipeline
import torch

model_id = "runwayml/stable-diffusion-v1-5"
pipe = StableDiffusionPipeline.from_pretrained(model_id, torch_dtype=torch.float16).to("cuda")

# set an HF_TOKEN value here that may differ from what is in the environment initially
use_auth_token = None

old_token = os.environ.get('HF_TOKEN', None)
if use_auth_token is not None:
    os.environ['HF_TOKEN'] = use_auth_token
try:
    pipe.load_textual_inversion("sd-concepts-library/cat-toy",
                                token="not-an-auth-token")
finally:
    # restore the original environment, removing HF_TOKEN again if it was
    # not set before we overrode it
    if old_token is not None:
        os.environ['HF_TOKEN'] = old_token
    elif use_auth_token is not None:
        os.environ.pop('HF_TOKEN', None)

prompt = "A <not-an-auth-token> backpack"

image = pipe(prompt, num_inference_steps=50).images[0]
image.save("cat-backpack.png")
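
Another possible workaround, sketched under the assumption that the embedding file is named learned_embeds.bin (the actual file name varies per repository): download the weights yourself via hf_hub_download, which does accept an auth token, then hand the loaded state dict to load_textual_inversion, which also accepts a torch state dict per its docstring.

import torch
from huggingface_hub import hf_hub_download
from diffusers import StableDiffusionPipeline

model_id = "runwayml/stable-diffusion-v1-5"
pipe = StableDiffusionPipeline.from_pretrained(model_id, torch_dtype=torch.float16).to("cuda")

# hf_hub_download honors token=, so private repositories work here
embed_path = hf_hub_download(
    repo_id="sd-concepts-library/cat-toy",
    filename="learned_embeds.bin",   # assumed file name; check the repository
    token="hf_your_auth_token",      # placeholder auth token
)

# load_textual_inversion also accepts a torch state dict, so no further
# download (and therefore no auth token) is needed at this point; the
# trigger word is taken from the key stored in the state dict
state_dict = torch.load(embed_path, map_location="cpu")
pipe.load_textual_inversion(state_dict)

prompt = "A <cat-toy> backpack"
image = pipe(prompt, num_inference_steps=50).images[0]
image.save("cat-backpack.png")
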
github-actions[bot] commented 1 month ago

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

Please note that issues that do not follow the contributing guidelines are likely to be ignored.