google / prompt-to-prompt

TypeError: getattr(): attribute name must be string from "null_text_w_ptp.ipynb" file #42

Open jakeyahn opened 1 year ago

jakeyahn commented 1 year ago

I am trying to run the Jupyter notebook, and the third block gives me the following error.

    scheduler = DDIMScheduler(beta_start=0.00085, beta_end=0.012, beta_schedule="scaled_linear", clip_sample=False, set_alpha_to_one=False)
    MY_TOKEN = ''
    LOW_RESOURCE = False
    NUM_DDIM_STEPS = 50
    GUIDANCE_SCALE = 7.5
    MAX_NUM_WORDS = 77
    device = torch.device('cuda:0') if torch.cuda.is_available() else torch.device('cpu')
    ldm_stable = StableDiffusionPipeline.from_pretrained("CompVis/stable-diffusion-v1-4", use_auth_token=MY_TOKEN, scheduler=scheduler).to(device)
    try:
        ldm_stable.disable_xformers_memory_efficient_attention()
    except AttributeError:
        print("Attribute disable_xformers_memory_efficient_attention() is missing")
    tokenizer = ldm_stable.tokenizer


    TypeError                                 Traceback (most recent call last)
    Cell In[3], line 8
          6 MAX_NUM_WORDS = 77
          7 device = torch.device('cuda:0') if torch.cuda.is_available() else torch.device('cpu')
    ----> 8 ldm_stable = StableDiffusionPipeline.from_pretrained("CompVis/stable-diffusion-v1-4", use_auth_token=MY_TOKEN, scheduler=scheduler).to(device)
          9 try:
         10     ldm_stable.disable_xformers_memory_efficient_attention()

    File ~/anaconda3/envs/p2p/lib/python3.8/site-packages/diffusers/pipeline_utils.py:373, in DiffusionPipeline.from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
        370 if issubclass(class_obj, class_candidate):
        371     load_method_name = importable_classes[class_name][1]
    --> 373 load_method = getattr(class_obj, load_method_name)
        375 loading_kwargs = {}
        376 if issubclass(class_obj, torch.nn.Module):

    TypeError: getattr(): attribute name must be string

any comments?

All the other Jupyter notebooks work well.

somuchtome commented 1 year ago

You can try pip install diffusers==0.10.0 rather than 0.3.0.
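
For reference, here is a minimal sketch (not from the thread; it assumes the packaging module is installed alongside diffusers) that fails fast when an older diffusers release is detected:

    # Sketch only: check the installed diffusers version before building the pipeline.
    # The 0.10.0 pin follows the suggestion above; adjust it if your setup differs.
    from packaging import version
    import diffusers

    if version.parse(diffusers.__version__) < version.parse("0.10.0"):
        raise RuntimeError(
            f"Found diffusers {diffusers.__version__}; "
            "upgrade with: pip install diffusers==0.10.0"
        )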

moonnnpie commented 1 year ago

The problem is solved, but it raises a new one:

TypeError: PNDMScheduler.set_timesteps() got an unexpected keyword argument 'offset'

How should I fix it?

zed1025 commented 1 year ago

It was working fine with diffusers==0.3.0 until recently.

It's not working now; is there any potential fix?

lindapu-1 commented 1 year ago

> The problem is solved, but it raises a new one:
>
> TypeError: PNDMScheduler.set_timesteps() got an unexpected keyword argument 'offset'
>
> How should I fix it?

Yes, I got the same issue. Neither diffusers 0.3.0 nor 0.10.0 works.

XuejiFang commented 1 year ago

> The problem is solved, but it raises a new one:
>
> TypeError: PNDMScheduler.set_timesteps() got an unexpected keyword argument 'offset'
>
> How should I fix it?

Just comment out line 163 and revise the model.scheduler.set_timesteps call in ptp_utils.py:

    # set timesteps
    # extra_set_kwargs = {"offset": 1}
    # model.scheduler.set_timesteps(num_inference_steps, **extra_set_kwargs)
    model.scheduler.set_timesteps(num_inference_steps)
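
Alternatively, a version-tolerant sketch (not part of the original ptp_utils.py; it assumes model and num_inference_steps are in scope as in the snippet above) passes offset only when the installed scheduler still accepts it:

    # Sketch only: pass "offset" to set_timesteps only if the installed
    # scheduler's signature still supports it (older diffusers releases did).
    import inspect

    extra_set_kwargs = {}
    if "offset" in inspect.signature(model.scheduler.set_timesteps).parameters:
        extra_set_kwargs["offset"] = 1
    model.scheduler.set_timesteps(num_inference_steps, **extra_set_kwargs)
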
yuanzhi-zhu commented 11 months ago

Just try this: https://github.com/google/prompt-to-prompt/issues/29#issuecomment-1398825757