Updated diffusers support in hf_serialization.py

More fixes and improvements for #73. This change:

- Adds serialization support for the `StableDiffusionPipeline` components: `text_encoder`, `vae` (updated), `unet` (updated), `scheduler` (new), and `tokenizer` (new)
- The `scheduler` and `tokenizer` are saved as `.zip` files containing the directory written by their `.save_pretrained()` methods
- Requires nothing from the HuggingFace Hub at deserialization time
- Adds S3 upload capability for `transformers` tokenizers
- Adjusts serialization to pass validation checks by using `include_non_persistent_buffers=False`
- Merges in the latest changes from `main` to support using `include_non_persistent_buffers=False` correctly
- Avoids re-initializing models from HF for no reason
- Cleans up misspelled/outdated CLI arguments and help text
- Adds parameter weight validation for `diffusers` models
- Substantially refactors `serialize_model`
- Adds logging-level command-line arguments and shifts more output to use a logger
- (Outdated) I left in the code that generates a test image through `diffusers`: it is good example code for how to re-assemble the components of an SD model, and it was useful for verifying that these changes work. It could be commented out or deleted later, or enabled only through a flag.
- (Update) The test image generation code for `diffusers` is now commented out.
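The `.zip` packaging of the `scheduler` and `tokenizer` described above can be sketched with only the standard library. This is an illustrative sketch, not the actual code in `hf_serialization.py`: the helper names (`zip_directory`, `unzip_directory`) are hypothetical, and a dummy `tokenizer_config.json` stands in for a real `.save_pretrained()` output directory.

```python
import json
import tempfile
import zipfile
from pathlib import Path


def zip_directory(src_dir: Path, zip_path: Path) -> None:
    """Archive every file under src_dir, keeping paths relative to it."""
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for file in sorted(src_dir.rglob("*")):
            if file.is_file():
                zf.write(file, file.relative_to(src_dir))


def unzip_directory(zip_path: Path, dest_dir: Path) -> None:
    """Restore the directory tree so .from_pretrained(dest_dir) could read it."""
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(dest_dir)


with tempfile.TemporaryDirectory() as tmp:
    root = Path(tmp)
    # Stand-in for tokenizer.save_pretrained(src): write one fake config file.
    src = root / "tokenizer"
    src.mkdir()
    (src / "tokenizer_config.json").write_text(json.dumps({"model_max_length": 77}))

    archive = root / "tokenizer.zip"
    zip_directory(src, archive)

    restored = root / "restored"
    unzip_directory(archive, restored)
    config = json.loads((restored / "tokenizer_config.json").read_text())
    print(config["model_max_length"])  # -> 77
```

Because the archive contains the whole directory tree, deserialization needs nothing from the HuggingFace Hub: unzip and load from the local path.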
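The `include_non_persistent_buffers=False` setting lines up with how PyTorch itself treats buffers registered with `persistent=False`: they are excluded from `state_dict()`, so skipping them during serialization keeps the saved weights consistent with a `state_dict()`-based validation pass. A minimal sketch of that PyTorch behavior (the `WithBuffers` module is invented for illustration):

```python
import torch


class WithBuffers(torch.nn.Module):
    def __init__(self):
        super().__init__()
        # Persistent buffer: included in state_dict(), so it must be serialized.
        self.register_buffer("running_mean", torch.zeros(3))
        # Non-persistent buffer: excluded from state_dict(); a serializer that
        # skips it stays consistent with what validation will compare against.
        self.register_buffer("scratch", torch.ones(3), persistent=False)


module = WithBuffers()
print(sorted(module.state_dict().keys()))  # -> ['running_mean']
```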
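Parameter weight validation for `diffusers` models amounts to a round-trip check: every tensor in the deserialized model should match the original. The framework-free sketch below illustrates the idea only; `validate_weights` is a hypothetical name, and a real implementation would compare `torch` tensors (e.g. via `torch.allclose`) rather than lists of floats.

```python
import math


def allclose(a, b, rtol=1e-5, atol=1e-8):
    """Elementwise closeness test, mirroring torch.allclose semantics."""
    return len(a) == len(b) and all(
        math.isclose(x, y, rel_tol=rtol, abs_tol=atol) for x, y in zip(a, b)
    )


def validate_weights(original: dict, restored: dict) -> list:
    """Return names of parameters that changed or went missing in the round trip."""
    bad = []
    for name, weights in original.items():
        if name not in restored or not allclose(weights, restored[name]):
            bad.append(name)
    return bad


state = {"unet.conv_in.weight": [0.125, -0.5], "vae.decoder.bias": [1.0]}
roundtrip = {"unet.conv_in.weight": [0.125, -0.5], "vae.decoder.bias": [1.0 + 1e-3]}
print(validate_weights(state, roundtrip))  # -> ['vae.decoder.bias']
```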
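The logging-level command-line argument can be sketched with `argparse` and the standard `logging` module. The flag spelling (`--log-level`) and logger name are illustrative assumptions, not necessarily what this PR's CLI uses:

```python
import argparse
import logging

logger = logging.getLogger("hf_serialization")


def parse_args(argv=None):
    parser = argparse.ArgumentParser(description="Serialize HF model components")
    # Hypothetical flag name; the actual CLI may spell it differently.
    parser.add_argument(
        "--log-level",
        default="INFO",
        choices=["DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL"],
        help="Verbosity for the serializer's logger",
    )
    return parser.parse_args(argv)


args = parse_args(["--log-level", "DEBUG"])
logging.basicConfig(level=getattr(logging, args.log_level))
logger.debug("debug output now visible")  # routed through the logger, not print()
print(args.log_level)  # -> DEBUG
```

Shifting output from `print()` to a named logger is what makes such a flag useful: one switch controls verbosity for every message in the tool.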