keras-team / keras-nlp

Modular Natural Language Processing workflows with Keras
Apache License 2.0

Fix saving bug for untied weights with keras 3.2 #1568

Closed · mattdangerw closed this pull request 3 months ago

mattdangerw commented 3 months ago

Tricky bug: keras.layers.Embedding has started overriding save_own_variables/load_own_variables to support quantization, so we can no longer rely on absolute indexing to restore the reverse embedding, e.g. self.reverse_embeddings.assign(store["1"]).
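For reference, a minimal sketch of the indexing-robust approach (an illustration, not the exact diff in this PR): the subclass appends its untied reverse weights after whatever the parent wrote, instead of assuming they live at index "1". Names like tie_weights follow the existing ReversibleEmbedding layer but details may differ.

```python
import keras


class ReversibleEmbedding(keras.layers.Embedding):
    """Sketch of an embedding with an optional untied reverse projection."""

    def __init__(self, input_dim, output_dim, tie_weights=True, **kwargs):
        super().__init__(input_dim, output_dim, **kwargs)
        self.tie_weights = tie_weights

    def build(self, input_shape=None):
        super().build(input_shape)
        if not self.tie_weights:
            self.reverse_embeddings = self.add_weight(
                name="reverse_embeddings",
                shape=(self.output_dim, self.input_dim),
                initializer="glorot_uniform",
            )

    def save_own_variables(self, store):
        if not self.built:
            return
        # Let the parent write however many variables it owns (with
        # Keras >= 3.2 this can include quantization state), then append
        # the untied reverse weights under the next free integer key.
        super().save_own_variables(store)
        if not self.tie_weights:
            store[str(len(store.keys()))] = self.reverse_embeddings

    def load_own_variables(self, store):
        super().load_own_variables(store)
        if not self.tie_weights:
            # Read the reverse weights back from the last key instead of
            # a hard-coded store["1"].
            self.reverse_embeddings.assign(store[str(len(store.keys()) - 1)])
```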

I also couldn't figure out a way to keep supporting loading tied weights into an untied layer. This is because the embedding will now unconditionally error if the weight counts are mismatched: https://github.com/keras-team/keras/blob/v3.2.0/keras/layers/core/embedding.py#L232-L264
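To make the mismatch concrete, here's a small illustration using the sketch class above (a plain dict stands in for the weight store; this is not code from the PR):

```python
# A tied layer owns one variable; an untied layer owns two.
tied = ReversibleEmbedding(100, 16, tie_weights=True)
tied.build()
untied = ReversibleEmbedding(100, 16, tie_weights=False)
untied.build()

store = {}
tied.save_own_variables(store)  # one entry: the embedding matrix

try:
    # Per the linked embedding.py lines, Keras >= 3.2 errors when the
    # stored variable count doesn't match the layer's variable count.
    untied.load_own_variables(store)
except Exception as e:
    print("Loading tied weights untied failed:", e)
```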

I don't think we have a direct need for this functionality today, so I just removed it, but it could be worth considering how to bring it back in the future.