The language I am attempting to summarize for is English.
Simply attempting to run a command like
python run_summarization.py --documents_dir ../../../../test-summaries/ --no_cuda true --min_length 50 --max_length 200 --alpha 0.95
fails with the following error:
Traceback (most recent call last):
File "run_summarization.py", line 15, in <module>
from .utils_summarization import (
ModuleNotFoundError: No module named '__main__.utils_summarization'; '__main__' is not a package
I thought the import line was strange and changed it to from utils_summarization import ( (note that I removed the . that preceded utils_summarization). This seemed to fix the error, although I am unsure whether it is the correct fix.
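For context on why the original line fails: a relative import such as from .utils_summarization import ... can only be resolved when the file is imported as part of a package; a script executed directly runs as __main__ with no parent package, which is exactly what the ModuleNotFoundError above reports. A minimal sketch of the underlying behavior (the module name is just the one from the traceback; it does not need to exist for the demonstration):

```python
import importlib

# A relative import needs a parent package to anchor to. A script executed
# directly runs as "__main__" with no parent package, so Python cannot
# resolve "from .utils_summarization import ...". importlib surfaces the
# same constraint when no package is supplied for a relative name.
try:
    importlib.import_module(".utils_summarization")
except TypeError as exc:
    print("relative import failed:", exc)
```

This is presumably why dropping the leading dot appears to work when the script is launched from its own directory: the sibling file is then found on sys.path as a plain top-level module.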
Nevertheless, even with this temporary fix, the run_summarization.py script fails with the following error:
INFO:filelock:Lock 140401652398456 acquired on /home/nikhil/.cache/torch/transformers/26bc1ad6c0ac742e9b52263248f6d0f00068293b33709fae12320c0e35ccfbbb.542ce4285a40d23a559526243235df47c5f75c197f04f37d1a0c124c32c9a084.lock
INFO:transformers.file_utils:https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt not found in cache or force_download set to True, downloading to /home/nikhil/.cache/torch/transformers/tmpjgcj6x3w
Downloading: 100%|████████████████████████████████████████| 232k/232k [00:00<00:00, 919kB/s]
INFO:transformers.file_utils:storing https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt in cache at /home/nikhil/.cache/torch/transformers/26bc1ad6c0ac742e9b52263248f6d0f00068293b33709fae12320c0e35ccfbbb.542ce4285a40d23a559526243235df47c5f75c197f04f37d1a0c124c32c9a084
INFO:transformers.file_utils:creating metadata file for /home/nikhil/.cache/torch/transformers/26bc1ad6c0ac742e9b52263248f6d0f00068293b33709fae12320c0e35ccfbbb.542ce4285a40d23a559526243235df47c5f75c197f04f37d1a0c124c32c9a084
INFO:filelock:Lock 140401652398456 released on /home/nikhil/.cache/torch/transformers/26bc1ad6c0ac742e9b52263248f6d0f00068293b33709fae12320c0e35ccfbbb.542ce4285a40d23a559526243235df47c5f75c197f04f37d1a0c124c32c9a084.lock
INFO:transformers.tokenization_utils_base:loading file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt from cache at /home/nikhil/.cache/torch/transformers/26bc1ad6c0ac742e9b52263248f6d0f00068293b33709fae12320c0e35ccfbbb.542ce4285a40d23a559526243235df47c5f75c197f04f37d1a0c124c32c9a084
Traceback (most recent call last):
File "/home/nikhil/.pyenv/versions/huggingface/lib/python3.6/site-packages/transformers/configuration_utils.py", line 243, in get_config_dict
raise EnvironmentError
OSError
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "run_summarization.py", line 324, in <module>
main()
File "run_summarization.py", line 309, in main
evaluate(args)
File "run_summarization.py", line 33, in evaluate
model = BertAbs.from_pretrained("bertabs-finetuned-cnndm")
File "/home/nikhil/.pyenv/versions/huggingface/lib/python3.6/site-packages/transformers/modeling_utils.py", line 602, in from_pretrained
**kwargs,
File "/home/nikhil/.pyenv/versions/huggingface/lib/python3.6/site-packages/transformers/configuration_utils.py", line 201, in from_pretrained
config_dict, kwargs = cls.get_config_dict(pretrained_model_name_or_path, **kwargs)
File "/home/nikhil/.pyenv/versions/huggingface/lib/python3.6/site-packages/transformers/configuration_utils.py", line 252, in get_config_dict
raise EnvironmentError(msg)
OSError: Can't load config for 'bertabs-finetuned-cnndm'. Make sure that:
- 'bertabs-finetuned-cnndm' is a correct model identifier listed on 'https://huggingface.co/models'
- or 'bertabs-finetuned-cnndm' is the correct path to a directory containing a config.json file
Based on the error message, I looked up bertabs-finetuned-cnndm on https://huggingface.co/models to find that there is no exact match for this model name. The closest match is called remi/bertabs-finetuned-cnndm-extractive-abstractive-summarization. Should the script be updated to include this model name instead?
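If the remi/bertabs-finetuned-cnndm-extractive-abstractive-summarization checkpoint is in fact the intended weights, the hard-coded identifier in evaluate() could presumably be updated to that namespaced name. The error message also offers a second option: pointing from_pretrained at a local directory containing a config.json. A minimal sketch of that directory layout (the config contents here are placeholders, not the real BertAbs configuration):

```python
import json
import os
import tempfile

# from_pretrained accepts a path to a directory containing a config.json
# (second bullet of the error message). Build a stand-in directory here;
# the single key below is illustrative only.
model_dir = tempfile.mkdtemp()
with open(os.path.join(model_dir, "config.json"), "w") as f:
    json.dump({"model_type": "bertabs"}, f)  # placeholder config

# run_summarization.py could then call, e.g.:
#   model = BertAbs.from_pretrained(model_dir)
print(sorted(os.listdir(model_dir)))  # ['config.json']
```

In practice the directory would also need the actual weights and the real configuration for the fine-tuned model, so changing the identifier string is likely the simpler fix if the hub model is the right one.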
Environment info
Output of transformers-cli env:
- `transformers` version: 2.11.0
- Platform: Linux-4.4.0-18362-Microsoft-x86_64-with-debian-bullseye-sid
- Python version: 3.6.9
- PyTorch version (GPU?): 1.5.1+cpu (False)
- Tensorflow version (GPU?): 2.2.0 (False)
- Using GPU in script?: No
- Using distributed or parallel set-up in script?: No
🐛 Bug
Information
Attempting to use BertAbs with the official example script for summarization: https://github.com/huggingface/transformers/tree/master/examples/summarization/bertabs#summarize-any-text