/home/qcaiaj/miniconda3/envs/prompt_lm_eval/lib/python3.8/site-packages/transformers/models/t5/tokenization_t5_fast.py:160: FutureWarning: This tokenizer was incorrectly instantiated with a model max length of 512 which will be corrected in Transformers v5.
For now, this behavior is kept to avoid breaking backwards compatibility when padding/encoding with `truncation is True`.
- Be aware that you SHOULD NOT rely on t5-base automatically truncating your input to 512 when padding/encoding.
- If you want to encode/pad to sequences longer than 512 you can either instantiate this tokenizer with `model_max_length` or pass `max_length` when encoding/padding.
- To avoid this warning, please instantiate this tokenizer with `model_max_length` set to your preferred value.
warnings.warn(
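(Side note on the tokenizer warning: the warning text itself says to pass `model_max_length` at instantiation. A minimal sketch of that, assuming the `t5-base` checkpoint from the log; 512 just mirrors the value the warning mentions:

```python
from transformers import AutoTokenizer

# Passing model_max_length explicitly silences the FutureWarning and
# makes the truncation length deliberate rather than implicit.
tokenizer = AutoTokenizer.from_pretrained("t5-base", model_max_length=512)
```
)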
» Assigning unique IDs to 'mnli+must be true' docs
» Filtering invalid docs from 'mnli+must be true'
» Constructing 'mnli+must be true' contexts and requests
100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 9815/9815 [00:24<00:00, 402.34it/s]
» Running all `loglikelihood` requests
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 921/921 [02:22<00:00, 6.44it/s]
Traceback (most recent call last):
  File "main.py", line 215, in <module>
    main()
  File "main.py", line 197, in main
    results = evaluator.cli_evaluate(**evaluate_args)
  File "/home/qcaiaj/workspace/NLP/bigscience-workshop/lm-evaluation-harness/lm_eval/evaluator.py", line 90, in cli_evaluate
    results = evaluate(
  File "/home/qcaiaj/workspace/NLP/bigscience-workshop/lm-evaluation-harness/lm_eval/evaluator.py", line 242, in evaluate
    output = task.process_results(doc, per_doc_results)
  File "/home/qcaiaj/workspace/NLP/bigscience-workshop/lm-evaluation-harness/lm_eval/api/task.py", line 511, in process_results
    assert isinstance(target, list) and len(target) == 1
AssertionError
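From the traceback, `process_results` in `lm_eval/api/task.py` expects `target` to be a one-element list. A minimal sketch of that check, copied from the failing assert (the "entailment" label is hypothetical, only there to show the shapes):

```python
# Reproduction of the check at lm_eval/api/task.py line 511.
def check_target(target):
    assert isinstance(target, list) and len(target) == 1

check_target(["entailment"])       # passes: a one-element list
try:
    check_target("entailment")     # fails: a bare string is not a list
except AssertionError:
    print("a bare-string target triggers the AssertionError")
```

So it looks like the target produced for 'mnli+must be true' is not arriving as a one-element list, but I have not found where it is built.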
I ran the evaluation and got the error above. Any suggestions?