Open bat1115 opened 2 months ago
Are the denoising UNet of the LDM and the BERTEmbedder both trained? What is configure_optimizers used for?
No, neither the denoising UNet of the LDM nor the BERTEmbedder is trained. The codebase is borrowed from textual inversion, so configure_optimizers contains some other options, but only the first if branch is actually used in our code.
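For context, a minimal sketch of the kind of configure_optimizers logic being described (PyTorch Lightning style). This is not the repo's exact code; attribute names such as embedding_manager, unfreeze_model, and cond_stage_model are assumptions based on textual-inversion-derived codebases.

```python
import torch

# Sketch only: a textual-inversion-style configure_optimizers where only the
# first branch is taken, so the UNet and BERTEmbedder parameters are never
# handed to the optimizer.
def configure_optimizers(self):
    lr = self.learning_rate
    if self.embedding_manager is not None:
        # the branch actually used: optimize only the learned embedding vectors
        params = list(self.embedding_manager.embedding_parameters())
    elif self.unfreeze_model:
        # unused in this setup: would also train the UNet and text encoder
        params = list(self.model.parameters()) + list(self.cond_stage_model.parameters())
    else:
        # unused in this setup
        params = list(self.model.parameters())
    return torch.optim.AdamW(params, lr=lr)
```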
What about configure_opt_embedding() and configure_opt_model()? Are they run during training?
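For reference, in the textual-inversion codebase these two helpers look roughly as follows (a sketch from that upstream code, not verified against this repo): each one just toggles requires_grad on a different parameter subset and returns an optimizer over it.

```python
import torch

# Sketch only: what the two helpers typically do in textual-inversion-style code.
def configure_opt_embedding(self):
    # freeze the diffusion model and the conditioning encoder;
    # optimize only the learned embedding vectors
    for p in self.model.parameters():
        p.requires_grad = False
    for p in self.cond_stage_model.parameters():
        p.requires_grad = False
    params = list(self.embedding_manager.embedding_parameters())
    for p in params:
        p.requires_grad = True
    return torch.optim.AdamW(params, lr=self.learning_rate)

def configure_opt_model(self):
    # unfreeze both the embedding vectors and the full model
    params = list(self.embedding_manager.embedding_parameters()) + list(self.model.parameters())
    for p in params:
        p.requires_grad = True
    return torch.optim.AdamW(params, lr=self.learning_rate)
```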
My question is: how can BERTEmbedder guarantee a meaningful output from z, position = self.transformer(tokens, return_embeddings=True, **kwargs) if its parameters are never updated? self.transformer = TransformerWrapper() is not a pre-trained model.
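For reference, the code being asked about is roughly the following (a paraphrased sketch of BERTEmbedder from the latent-diffusion codebase; argument names and defaults may differ from this repo's copy).

```python
import torch.nn as nn
from x_transformers import Encoder, TransformerWrapper

# Sketch only: BERTEmbedder wraps an x-transformers encoder that is not
# separately pre-trained; its output z conditions the diffusion UNet.
class BERTEmbedder(nn.Module):
    def __init__(self, n_embed, n_layer, vocab_size=30522, max_seq_len=77):
        super().__init__()
        self.transformer = TransformerWrapper(
            num_tokens=vocab_size,
            max_seq_len=max_seq_len,
            attn_layers=Encoder(dim=n_embed, depth=n_layer),
        )

    def forward(self, tokens, **kwargs):
        # the line the question refers to: z is whatever this encoder produces,
        # i.e. random-init weights unless they are restored from a checkpoint
        z = self.transformer(tokens, return_embeddings=True, **kwargs)
        return z
```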