Models to perform neural summarization (extractive and abstractive) using machine learning transformers and a tool to convert abstractive summarization datasets to the extractive task.
When running TransformerSum with PyTorch 1.10.2, the training script fails with:

```
pytorch_lightning.utilities.exceptions.MisconfigurationException: You have asked for `amp_level='O1'` but it's only supported with `amp_backend='apex'`.
```
One of the GitHub issues mentions this problem, and the suggested solution was simply to remove the argument. Instead, I think a choice of AMP backend should be added to the list of parameters of `main.py`. In that case, the default value of `--amp_level` should be set to `None`, since PyTorch's native AMP backend does not support `--amp_level='O1'`.
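For illustration, here is a minimal sketch of what this could look like. The parser excerpt is hypothetical (not TransformerSum's actual `main.py` code); the argument names mirror the `amp_backend` and `amp_level` flags of `pytorch_lightning.Trainer` in PL 1.x:

```python
import argparse

import pytorch_lightning as pl

# Hypothetical excerpt of main.py's argument parser: expose the AMP backend
# choice alongside --amp_level, with defaults that work for the native backend.
parser = argparse.ArgumentParser()
parser.add_argument(
    "--amp_backend",
    type=str,
    default="native",
    choices=["native", "apex"],
    help="Mixed precision backend: 'native' (torch.cuda.amp) or 'apex' (NVIDIA Apex).",
)
parser.add_argument(
    "--amp_level",
    type=str,
    default=None,
    help="Apex optimization level (e.g. 'O1'); only valid with --amp_backend=apex.",
)
args = parser.parse_args()

# With --amp_level defaulting to None, the native backend no longer triggers
# the MisconfigurationException raised when amp_level is set without Apex.
trainer = pl.Trainer(amp_backend=args.amp_backend, amp_level=args.amp_level)
```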