rinongal / textual_inversion

MIT License
2.9k stars 279 forks

Deforum-Stable-Diffusion Inference? #130

Open AIManifest opened 1 year ago

AIManifest commented 1 year ago

Hi!

Aspiring coder/code-hacker/novice/enthusiast here. I've been searching for months for a way to load textual inversion .pt/.bin embeddings into Stable Diffusion checkpoints. I've tried many avenues, including Automatic1111, diffusers, and now this implementation, to load the embeddings into the checkpoint, but I believe I'm missing key steps and values, which is why I haven't succeeded yet. I'm asking for assistance with this. Full transparency: I want to be able to use textual inversion with Deforum's notebook. I've used this method:

```python
custom_config_path = "/content/textual_inversion/configs/stable-diffusion/v1-inference.yaml" #@param {type:"string"}
custom_checkpoint_path = "/content/drive/MyDrive/AI/models/v1-5-pruned.ckpt" #@param {type:"string"}
ckpt_dir = "/content/drive/MyDrive/AI/models/stable-diffusion-2-1"
embedding_dir = "/content/drive/MyDrive/sd/stable-diffusion-webui/embeddings/LysergianDreams-3600.pt"

config = OmegaConf.load(f"{custom_config_path}")
model = load_model_from_config(config, f"{custom_checkpoint_path}")
embedding = torch.load(embedding_dir, map_location="cpu")
model.embedding_manager.load(embedding)

device = torch.device("cuda") if torch.cuda.is_available() else torch.device("cpu")
model = model.to(device)
```

The error I'm getting is:

```
AttributeError: 'dict' object has no attribute 'seek'. You can only torch.load from a file that is seekable. Please pre-load the data into a buffer like io.BytesIO and try to load from it instead
```
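For context on what the message's `io.BytesIO` suggestion means, here's a quick standalone illustration of that pattern. I'm using `pickle` in place of torch's serializer and a dummy dict in place of a real embedding, so this is just the shape of the workaround, not my actual code:

```python
import io
import pickle

# Dummy payload standing in for a real embedding checkpoint
payload = {"string_to_param": [0.1, 0.2, 0.3]}
raw_bytes = pickle.dumps(payload)

# torch.load needs a seekable file-like object, not raw bytes or a dict;
# wrapping the bytes in an in-memory io.BytesIO buffer provides one.
buffer = io.BytesIO(raw_bytes)
restored = pickle.load(buffer)
print(restored["string_to_param"])  # [0.1, 0.2, 0.3]
```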

I understand the attribute error itself, but I don't understand why it comes up when all I'm doing is loading the embedding, which is why I believe I'm misunderstanding something or missing important values.
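My working theory: `torch.load` treats anything that isn't a path string as a file-like object and calls `.seek` on it, so if `embedding_manager.load` itself calls `torch.load` internally, handing it the already-deserialized dict would explain the error. Here is a minimal pure-Python sketch of that dispatch; `fake_torch_load` is my illustrative stand-in, not the real torch code:

```python
import pickle

def fake_torch_load(f):
    # Illustrative stand-in for torch.load's dispatch: a str is treated as a
    # path to open; anything else is assumed to be a seekable file-like object.
    if isinstance(f, str):
        with open(f, "rb") as fh:
            return pickle.load(fh)
    f.seek(0)  # a plain dict has no .seek -> AttributeError, as in my traceback
    return pickle.load(f)

# Passing an already-deserialized dict (instead of a path) reproduces the error:
try:
    fake_torch_load({"string_to_param": "..."})
except AttributeError as err:
    print(err)  # 'dict' object has no attribute 'seek'
```

If that theory is right, the fix might be as simple as passing the path instead of the loaded dict, i.e. `model.embedding_manager.load(embedding_dir)`, but I'd appreciate confirmation from someone who knows this codebase.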

I'm looking for assistance, detailed or not, with getting this method to work with the Deforum notebook for Stable Diffusion animations. Any insight will be greatly appreciated. Thank you in advance for your time and the work you contribute to the community!