HHousen / TransformerSum

Models to perform neural summarization (extractive and abstractive) using machine learning transformers and a tool to convert abstractive summarization datasets to the extractive task.
https://transformersum.rtfd.io
GNU General Public License v3.0

Add support for different AMP backends #64

Closed: taiypeo closed this issue 2 years ago

taiypeo commented 2 years ago

When running TransformerSum with PyTorch 1.10.2, the training script fails with `pytorch_lightning.utilities.exceptions.MisconfigurationException`: You have asked for `amp_level='O1'` but it's only supported with `amp_backend='apex'`.
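For reference, the error can be reproduced outside TransformerSum with a bare `Trainer`. This is a minimal sketch, assuming a pytorch_lightning 1.5/1.6-era release (the versions contemporary with PyTorch 1.10), where the flag-consistency check fires at construction time:

```python
import pytorch_lightning as pl

# With the default native AMP backend, passing an Apex optimization
# level is rejected as soon as the Trainer is constructed:
# MisconfigurationException: You have asked for `amp_level='O1'` but
# it's only supported with `amp_backend='apex'`.
trainer = pl.Trainer(amp_backend="native", amp_level="O1")
```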

One of the GitHub issues mentions this problem, and the suggested solution was simply to remove the argument. Instead, I think a choice of AMP backend should be added to the list of parameters in `main.py`. In that case, the default value of `--amp_level` should be set to `None`, since PyTorch's native AMP backend does not support `--amp_level='O1'`.
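A minimal sketch of what that could look like, assuming `main.py` builds its `Trainer` from parsed command-line arguments (the exact integration point and help strings here are my assumptions, not the current code):

```python
import argparse

import pytorch_lightning as pl

parser = argparse.ArgumentParser()
parser.add_argument(
    "--amp_backend",
    type=str,
    default="native",
    choices=["native", "apex"],
    help="Mixed-precision backend: PyTorch's native AMP or NVIDIA Apex.",
)
parser.add_argument(
    "--amp_level",
    type=str,
    default=None,  # None instead of "O1": native AMP rejects any amp_level
    help="Apex optimization level (e.g. 'O1'); only valid with --amp_backend=apex.",
)
args = parser.parse_args()

# Both flags are forwarded to the Trainer; with the defaults above,
# native AMP works and Apex users can still opt in explicitly.
trainer = pl.Trainer(
    amp_backend=args.amp_backend,
    amp_level=args.amp_level,
)
```

With these defaults, existing invocations that never touch AMP flags keep working on PyTorch's native backend, while `--amp_backend apex --amp_level O1` restores the old behavior for users who have Apex installed.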