Lightning-Universe / lightning-transformers

Flexible components pairing 🤗 Transformers with ⚡ PyTorch Lightning
https://lightning-transformers.readthedocs.io
Apache License 2.0

Lightning 1.8 breaks lightning-transformers #300

Closed juliusfrost closed 1 year ago

juliusfrost commented 1 year ago

🐛 Bug

Errors:

    LightningDeprecationWarning: LightningDataModule.on_save_checkpoint was deprecated in v1.6 and will be removed in v1.8. Use `state_dict` instead.
    LightningDeprecationWarning: LightningDataModule.on_load_checkpoint was deprecated in v1.6 and will be removed in v1.8. Use `load_state_dict` instead.

To Reproduce

Steps to reproduce the behavior:

Run any lightning-transformers DataModule in the trainer.

Code sample

These two methods in `lightning_transformers/core/data.py` in particular trigger the warnings.

    def on_save_checkpoint(self, checkpoint: Dict[str, Any]) -> None:
        checkpoint["tokenizer"] = self.tokenizer

    def on_load_checkpoint(self, checkpoint: Dict[str, Any]) -> None:
        self.tokenizer = checkpoint["tokenizer"]
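A possible migration, following the warnings' suggestion to move to the `state_dict`/`load_state_dict` hooks that `LightningDataModule` gained in Lightning 1.6. The class below is a minimal stand-in sketch (the real module in lightning-transformers subclasses `LightningDataModule` and holds a 🤗 tokenizer), not the actual fix from the repo:

```python
from typing import Any, Dict


class TokenizerDataModule:
    """Stand-in illustrating the replacement hook bodies.

    In the real code this would subclass LightningDataModule; the Trainer
    calls state_dict() when writing a checkpoint and load_state_dict()
    when restoring one, replacing the removed on_save_checkpoint /
    on_load_checkpoint hooks.
    """

    def __init__(self, tokenizer: Any) -> None:
        self.tokenizer = tokenizer

    def state_dict(self) -> Dict[str, Any]:
        # Persist the tokenizer into the checkpoint payload.
        return {"tokenizer": self.tokenizer}

    def load_state_dict(self, state_dict: Dict[str, Any]) -> None:
        # Restore the tokenizer from a previously saved checkpoint.
        self.tokenizer = state_dict["tokenizer"]
```

The hook bodies are unchanged; only the hook names (and the direction of the dict argument) differ from the deprecated pair shown above.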

Expected behavior

Borda commented 1 year ago

This should be addressed by the ongoing work in #297.