sugyan / stable-diffusion-morphing


Error: the component cannot be loaded, missing method #2

Open jochemstoel opened 1 year ago

jochemstoel commented 1 year ago

Hello, I'm trying to run your code with my own custom fine-tune of Stable Diffusion (also loaded from Hugging Face), but I get an error. I slightly changed the load-model cell to use my model name, like this:

!pip install diffusers[torch]==0.7.2 transformers accelerate scipy ftfy pytorch_lightning

import torch
from diffusers import StableDiffusionPipeline

HF_AUTH_TOKEN = '****************'

def get_device() -> torch.device:
    if torch.cuda.is_available():
        return torch.device("cuda")
    if torch.backends.mps.is_available():
        return torch.device("mps")
    return torch.device("cpu")

device = get_device()
pipe = StableDiffusionPipeline.from_pretrained(
    "jochemstoel/mymodel", torch_dtype=torch.float16, use_auth_token=HF_AUTH_TOKEN
)
pipe = pipe.to(device)

I had to remove revision="fp16" for this fine-tune. When I run that code, it starts downloading and then fails on the pipe = StableDiffusionPipeline.from_pretrained call. Here is my stdout log:

Looking in indexes: https://pypi.org/simple, https://us-python.pkg.dev/colab-wheels/public/simple/
Collecting diffusers[torch]==0.7.2
  Downloading diffusers-0.7.2-py3-none-any.whl (304 kB)
     |████████████████████████████████| 304 kB 14.6 MB/s 
Collecting transformers
  Downloading transformers-4.25.1-py3-none-any.whl (5.8 MB)
     |████████████████████████████████| 5.8 MB 67.1 MB/s 
Collecting accelerate
  Downloading accelerate-0.15.0-py3-none-any.whl (191 kB)
     |████████████████████████████████| 191 kB 78.5 MB/s 
Requirement already satisfied: scipy in /usr/local/lib/python3.8/dist-packages (1.7.3)
Collecting ftfy
  Downloading ftfy-6.1.1-py3-none-any.whl (53 kB)
     |████████████████████████████████| 53 kB 2.1 MB/s 
Collecting pytorch_lightning
  Downloading pytorch_lightning-1.8.5.post0-py3-none-any.whl (800 kB)
     |████████████████████████████████| 800 kB 72.4 MB/s 
Requirement already satisfied: regex!=2019.12.17 in /usr/local/lib/python3.8/dist-packages (from diffusers[torch]==0.7.2) (2022.6.2)
Requirement already satisfied: numpy in /usr/local/lib/python3.8/dist-packages (from diffusers[torch]==0.7.2) (1.21.6)
Requirement already satisfied: filelock in /usr/local/lib/python3.8/dist-packages (from diffusers[torch]==0.7.2) (3.8.2)
Requirement already satisfied: requests in /usr/local/lib/python3.8/dist-packages (from diffusers[torch]==0.7.2) (2.23.0)
Requirement already satisfied: importlib-metadata in /usr/local/lib/python3.8/dist-packages (from diffusers[torch]==0.7.2) (5.1.0)
Requirement already satisfied: huggingface-hub>=0.10.0 in /usr/local/lib/python3.8/dist-packages (from diffusers[torch]==0.7.2) (0.11.1)
Requirement already satisfied: Pillow<10.0 in /usr/local/lib/python3.8/dist-packages (from diffusers[torch]==0.7.2) (7.1.2)
Requirement already satisfied: torch>=1.4 in /usr/local/lib/python3.8/dist-packages (from diffusers[torch]==0.7.2) (1.13.0+cu116)
Requirement already satisfied: psutil in /usr/local/lib/python3.8/dist-packages (from accelerate) (5.4.8)
Requirement already satisfied: pyyaml in /usr/local/lib/python3.8/dist-packages (from accelerate) (6.0)
Requirement already satisfied: packaging>=20.0 in /usr/local/lib/python3.8/dist-packages (from accelerate) (21.3)
Requirement already satisfied: typing-extensions>=3.7.4.3 in /usr/local/lib/python3.8/dist-packages (from huggingface-hub>=0.10.0->diffusers[torch]==0.7.2) (4.4.0)
Requirement already satisfied: tqdm in /usr/local/lib/python3.8/dist-packages (from huggingface-hub>=0.10.0->diffusers[torch]==0.7.2) (4.64.1)
Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in /usr/local/lib/python3.8/dist-packages (from packaging>=20.0->accelerate) (3.0.9)
Collecting tokenizers!=0.11.3,<0.14,>=0.11.1
  Downloading tokenizers-0.13.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (7.6 MB)
     |████████████████████████████████| 7.6 MB 62.6 MB/s 
Requirement already satisfied: wcwidth>=0.2.5 in /usr/local/lib/python3.8/dist-packages (from ftfy) (0.2.5)
Collecting tensorboardX>=2.2
  Downloading tensorboardX-2.5.1-py2.py3-none-any.whl (125 kB)
     |████████████████████████████████| 125 kB 84.4 MB/s 
Requirement already satisfied: fsspec[http]>2021.06.0 in /usr/local/lib/python3.8/dist-packages (from pytorch_lightning) (2022.11.0)
Collecting torchmetrics>=0.7.0
  Downloading torchmetrics-0.11.0-py3-none-any.whl (512 kB)
     |████████████████████████████████| 512 kB 85.6 MB/s 
Collecting lightning-utilities!=0.4.0,>=0.3.0
  Downloading lightning_utilities-0.4.2-py3-none-any.whl (16 kB)
Requirement already satisfied: aiohttp!=4.0.0a0,!=4.0.0a1 in /usr/local/lib/python3.8/dist-packages (from fsspec[http]>2021.06.0->pytorch_lightning) (3.8.3)
Requirement already satisfied: charset-normalizer<3.0,>=2.0 in /usr/local/lib/python3.8/dist-packages (from aiohttp!=4.0.0a0,!=4.0.0a1->fsspec[http]>2021.06.0->pytorch_lightning) (2.1.1)
Requirement already satisfied: yarl<2.0,>=1.0 in /usr/local/lib/python3.8/dist-packages (from aiohttp!=4.0.0a0,!=4.0.0a1->fsspec[http]>2021.06.0->pytorch_lightning) (1.8.2)
Requirement already satisfied: attrs>=17.3.0 in /usr/local/lib/python3.8/dist-packages (from aiohttp!=4.0.0a0,!=4.0.0a1->fsspec[http]>2021.06.0->pytorch_lightning) (22.1.0)
Requirement already satisfied: multidict<7.0,>=4.5 in /usr/local/lib/python3.8/dist-packages (from aiohttp!=4.0.0a0,!=4.0.0a1->fsspec[http]>2021.06.0->pytorch_lightning) (6.0.3)
Requirement already satisfied: async-timeout<5.0,>=4.0.0a3 in /usr/local/lib/python3.8/dist-packages (from aiohttp!=4.0.0a0,!=4.0.0a1->fsspec[http]>2021.06.0->pytorch_lightning) (4.0.2)
Requirement already satisfied: aiosignal>=1.1.2 in /usr/local/lib/python3.8/dist-packages (from aiohttp!=4.0.0a0,!=4.0.0a1->fsspec[http]>2021.06.0->pytorch_lightning) (1.3.1)
Requirement already satisfied: frozenlist>=1.1.1 in /usr/local/lib/python3.8/dist-packages (from aiohttp!=4.0.0a0,!=4.0.0a1->fsspec[http]>2021.06.0->pytorch_lightning) (1.3.3)
Requirement already satisfied: protobuf<=3.20.1,>=3.8.0 in /usr/local/lib/python3.8/dist-packages (from tensorboardX>=2.2->pytorch_lightning) (3.19.6)
Requirement already satisfied: idna>=2.0 in /usr/local/lib/python3.8/dist-packages (from yarl<2.0,>=1.0->aiohttp!=4.0.0a0,!=4.0.0a1->fsspec[http]>2021.06.0->pytorch_lightning) (2.10)
Requirement already satisfied: zipp>=0.5 in /usr/local/lib/python3.8/dist-packages (from importlib-metadata->diffusers[torch]==0.7.2) (3.11.0)
Requirement already satisfied: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in /usr/local/lib/python3.8/dist-packages (from requests->diffusers[torch]==0.7.2) (1.24.3)
Requirement already satisfied: chardet<4,>=3.0.2 in /usr/local/lib/python3.8/dist-packages (from requests->diffusers[torch]==0.7.2) (3.0.4)
Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.8/dist-packages (from requests->diffusers[torch]==0.7.2) (2022.12.7)
Installing collected packages: torchmetrics, tokenizers, tensorboardX, lightning-utilities, diffusers, accelerate, transformers, pytorch-lightning, ftfy
Successfully installed accelerate-0.15.0 diffusers-0.7.2 ftfy-6.1.1 lightning-utilities-0.4.2 pytorch-lightning-1.8.5.post0 tensorboardX-2.5.1 tokenizers-0.13.2 torchmetrics-0.11.0 transformers-4.25.1
Downloading: 100% 543/543 [00:00<00:00, 28.3kB/s]
Fetching 15 files: 100% 15/15 [02:30<00:00, 16.75s/it]
Downloading: 100% 342/342 [00:00<00:00, 21.6kB/s]
Downloading: 100% 4.81k/4.81k [00:00<00:00, 274kB/s]
Downloading: 100% 1.22G/1.22G [00:29<00:00, 40.1MB/s]
Downloading: 100% 308/308 [00:00<00:00, 22.0kB/s]
Downloading: 100% 721/721 [00:00<00:00, 42.7kB/s]
Downloading: 100% 492M/492M [00:12<00:00, 43.1MB/s]
Downloading: 100% 525k/525k [00:00<00:00, 1.23MB/s]
Downloading: 100% 472/472 [00:00<00:00, 29.5kB/s]
Downloading: 100% 912/912 [00:00<00:00, 48.3kB/s]
Downloading: 100% 1.06M/1.06M [00:00<00:00, 2.05MB/s]
Downloading: 100% 963/963 [00:00<00:00, 41.4kB/s]
Downloading: 100% 3.44G/3.44G [01:27<00:00, 40.3MB/s]
Downloading: 100% 699/699 [00:00<00:00, 31.2kB/s]
Downloading: 100% 335M/335M [00:08<00:00, 43.4MB/s]
The config attributes {'dual_cross_attention': False, 'use_linear_projection': False} were passed to UNet2DConditionModel, but are not expected and will be ignored. Please verify your config.json configuration file.
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
[<ipython-input-3-a8879d91b2b9>](https://localhost:8080/#) in <module>
     17 
     18 device = get_device()
---> 19 pipe = StableDiffusionPipeline.from_pretrained(
     20     "jochemstoel/mymodel", torch_dtype=torch.float16, use_auth_token=HF_AUTH_TOKEN
     21 )

[/usr/local/lib/python3.8/dist-packages/diffusers/pipeline_utils.py](https://localhost:8080/#) in from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
    596                         class_obj()
    597 
--> 598                     raise ValueError(
    599                         f"The component {class_obj} of {pipeline_class} cannot be loaded as it does not seem to have"
    600                         f" any of the loading methods defined in {ALL_IMPORTABLE_CLASSES}."

ValueError: The component <class 'transformers.models.clip.image_processing_clip.CLIPImageProcessor'> of <class 'diffusers.pipelines.stable_diffusion.pipeline_stable_diffusion.StableDiffusionPipeline'> cannot be loaded as it does not seem to have any of the loading methods defined in {'ModelMixin': ['save_pretrained', 'from_pretrained'], 'SchedulerMixin': ['save_config', 'from_config'], 'DiffusionPipeline': ['save_pretrained', 'from_pretrained'], 'OnnxRuntimeModel': ['save_pretrained', 'from_pretrained'], 'PreTrainedTokenizer': ['save_pretrained', 'from_pretrained'], 'PreTrainedTokenizerFast': ['save_pretrained', 'from_pretrained'], 'PreTrainedModel': ['save_pretrained', 'from_pretrained'], 'FeatureExtractionMixin': ['save_pretrained', 'from_pretrained']}.

Can you please help me?

sugyan commented 1 year ago

Hmmm. I'm sorry, but I don't know why that error occurs. Maybe you can avoid it by changing the diffusers version. Have you tried that?

jochemstoel commented 1 year ago

Change it to what?
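
For reference, a minimal sketch of what such a version change could look like, assuming the failure comes from transformers 4.25.1 introducing CLIPImageProcessor, which the loading check in diffusers 0.7.2 does not recognize; the specific versions below are assumptions, not tested against this notebook:

# Option 1 (assumption): keep diffusers 0.7.2 but pin transformers to a release
# that still exposes the feature extractor as CLIPFeatureExtractor
!pip install diffusers[torch]==0.7.2 transformers==4.24.0 accelerate scipy ftfy pytorch_lightning

# Option 2 (assumption): move to a newer diffusers release whose loading code
# accepts the CLIPImageProcessor component
!pip install -U diffusers[torch] transformers accelerate scipy ftfy pytorch_lightning

Either way, restart the runtime after reinstalling so the new package versions are actually imported.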