oneir0mancer / stable-diffusion-diffusers-colab-ui

Run stable diffusion in Colab with UI on IPython widgets (no gradio)
MIT License

I can't load embeddings #13

Open · ghost opened 5 months ago

ghost commented 5 months ago

```
ValueError                                Traceback (most recent call last)
/content/StableDiffusionUi/ColabUI/TextualInversionChoice.py in on_load_clicked(b)
     20             self.out.clear_output()
     21             with self.out:
---> 22                 self.load(self.colab.pipe)
     23         self.load_button.on_click(on_load_clicked)
     24

4 frames
/content/StableDiffusionUi/ColabUI/TextualInversionChoice.py in load(self, pipe)
     31         if os.path.isfile(self.path.value):
     32             dir, filename = os.path.split(self.path.value)
---> 33             self.__load_textual_inversion(self.colab.pipe, dir, filename)
     34         elif os.path.isdir(self.path.value):
     35             self.load_textual_inversions_from_folder(self.colab.pipe, self.path.value)

/content/StableDiffusionUi/ColabUI/TextualInversionChoice.py in __load_textual_inversion(self, pipe, path, filename)
     37     #TODO load from file: https://huggingface.co/docs/diffusers/api/loaders/textual_inversion
     38     def __load_textual_inversion(self, pipe, path: str, filename: str):
---> 39         self.colab.pipe.load_textual_inversion(path, weight_name=filename)
     40
     41         token = f"<{os.path.splitext(filename)[0]}>"

/usr/local/lib/python3.10/dist-packages/huggingface_hub/utils/_validators.py in _inner_fn(*args, **kwargs)
    116             kwargs = smoothly_deprecate_use_auth_token(fn_name=fn.__name__, has_token=has_token, kwargs=kwargs)
    117
--> 118         return fn(*args, **kwargs)
    119
    120     return _inner_fn  # type: ignore

/usr/local/lib/python3.10/dist-packages/diffusers/loaders/textual_inversion.py in load_textual_inversion(self, pretrained_model_name_or_path, token, tokenizer, text_encoder, **kwargs)
    400
    401         # 4. Retrieve tokens and embeddings
--> 402         tokens, embeddings = self._retrieve_tokens_and_embeddings(tokens, state_dicts, tokenizer)
    403
    404         # 5. Extend tokens and embeddings for multi vector

/usr/local/lib/python3.10/dist-packages/diffusers/loaders/textual_inversion.py in _retrieve_tokens_and_embeddings(tokens, state_dicts, tokenizer)
    227
    228             if token in tokenizer.get_vocab():
--> 229                 raise ValueError(
    230                     f"Token {token} already in tokenizer vocabulary. Please choose a different token name or remove {token} and embedding from the tokenizer and text encoder."
    231                 )

ValueError: Token emb_params already in tokenizer vocabulary. Please choose a different token name or remove emb_params and embedding from the tokenizer and text encoder.
```

oneir0mancer commented 5 months ago

> ValueError: Token emb_params already in tokenizer vocabulary

This means you are trying to load an embedding that uses the same token as some other embedding that is already loaded. Maybe you are trying to load the same embedding twice, or two versions of the same embedding (I've seen this with EasyNegative and EasyNegativeV2).
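You can confirm that this is what's happening by checking the tokenizer vocabulary directly (a minimal sketch, assuming the pipeline created by the UI is reachable as `pipe`):

```python
# "emb_params" appears to be the token name diffusers reads from many
# A1111-style .safetensors embedding files when no explicit token is given,
# so loading a second such file collides with the first one.
print("emb_params" in pipe.tokenizer.get_vocab())
# True -> some previously loaded embedding already claimed that token
```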

You can try reloading the pipeline and then loading only the embedding you want first.
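For reference, a minimal sketch of that workaround outside the UI, assuming a standard diffusers pipeline (the model id and file paths below are placeholders):

```python
import torch
from diffusers import StableDiffusionPipeline

# Rebuild the pipeline so the tokenizer starts from a clean vocabulary.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # placeholder model id
    torch_dtype=torch.float16,
).to("cuda")

# Load only the embedding you actually want. Passing an explicit `token`
# sidesteps the default name read from the file ("emb_params" for many
# A1111-style .safetensors embeddings), so a second file can't collide.
pipe.load_textual_inversion(
    "/content/embeddings",                   # placeholder folder
    weight_name="EasyNegative.safetensors",  # placeholder file name
    token="<easynegative>",
)
```

If your diffusers version is recent enough, `pipe.unload_textual_inversion()` can also remove previously loaded tokens without rebuilding the whole pipeline.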

ghost commented 5 months ago

This doesn't help