etetteh opened this issue 3 years ago
Were you able to solve this issue?
No, I wasn't able to resolve it.
Could you please give more details, e.g. your config.json and the exact command you used for converting (discriminator or generator)?
There's one known problem with small generator models (needs a config change).
I think it could be a problem with the config. I was experiencing the same problem, but it was because I was using the small model's config while converting the base model. I changed the config and it works now.
@stefan-it could you tell me what the known problem with the small generator entails? Does it have to do with the setting of:
self.generator_hidden_size = 0.25 # frac of discrim hidden size for gen
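To make that fraction concrete, here is a minimal sketch of the scaling it implies, using ELECTRA-small-style discriminator dimensions (hidden_size=256, num_attention_heads=4, intermediate_size=1024 — these numbers are assumptions for illustration, not taken from this thread):

```python
generator_hidden_size = 0.25  # frac of discrim hidden size for gen

# Assumed ELECTRA-small-style discriminator dimensions (illustrative only).
disc_hidden_size = 256
disc_num_heads = 4
disc_intermediate_size = 1024

# The generator transformer is scaled down by the same fraction.
gen_hidden_size = int(disc_hidden_size * generator_hidden_size)              # 64
gen_num_heads = int(disc_num_heads * generator_hidden_size)                  # 1
gen_intermediate_size = int(disc_intermediate_size * generator_hidden_size)  # 256

print(gen_hidden_size, gen_num_heads, gen_intermediate_size)
```

So a config that still carries the discriminator's dimensions will not match the generator's actual weight shapes at conversion time.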
It was related to this configuration change, which is needed to convert the model correctly:
https://github.com/google-research/electra/issues/94#issuecomment-689633064
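For anyone else landing here: the linked fix amounts to adjusting config.json before converting the generator, so that its dimensions describe the scaled-down generator rather than the discriminator. A hedged sketch of that adjustment (the field names follow the usual ELECTRA config; the 0.25 fraction, the concrete sizes, and the output filename are assumptions for illustration):

```python
import json

# Assumed discriminator config values (illustrative, not from this thread).
disc_config = {
    "hidden_size": 256,
    "num_attention_heads": 4,
    "intermediate_size": 1024,
    "embedding_size": 128,
}

GEN_FRAC = 0.25  # generator_hidden_size fraction from the pretraining config

gen_config = dict(disc_config)
# Scale the transformer dimensions down for the generator. The embedding
# size is shared between generator and discriminator in ELECTRA, so it is
# left untouched.
for key in ("hidden_size", "num_attention_heads", "intermediate_size"):
    gen_config[key] = int(disc_config[key] * GEN_FRAC)

# Write the adjusted config next to the original and point the conversion
# script at it when converting the generator.
with open("generator_config.json", "w") as f:
    json.dump(gen_config, f, indent=2)
```

The unchanged original config.json stays correct for converting the discriminator; only the generator conversion needs the scaled file.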
Getting the following error when converting my ckpt to Hugging Face's PyTorch. I am using the same config file I used for the pretraining.
Also, converting the other ckpts does not start at all, except for the 1M training step one, which is also failing here.