microsoft / DeBERTa

The implementation of DeBERTa

how to solve this bug #88

Closed tjshu closed 2 years ago

tjshu commented 2 years ago

In this command:

```
python3 -m DeBERTa.apps.run --task_name $task --do_train \
  --data_dir $cache_dir/glue_tasks/$task \
  --eval_batch_size 128 \
  --predict_batch_size 128 \
  --output_dir $OUTPUT \
  --scale_steps 250 \
  --loss_scale 16384 \
  --accumulative_update 1 \
  --num_train_epochs 6 \
  --warmup 100 \
  --learning_rate 2e-5 \
  --train_batch_size 32 \
  --max_seq_len 128
```

I get this error:

```
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/torch/serialization.py", line 308, in _check_seekable
    f.seek(f.tell())
AttributeError: 'NoneType' object has no attribute 'seek'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/root/DeBERTa/DeBERTa/apps/run.py", line 431, in <module>
    main(args)
  File "/root/DeBERTa/DeBERTa/apps/run.py", line 243, in main
    tokenizer = tokenizers[vocab_type](vocab_path)
  File "/root/DeBERTa/DeBERTa/deberta/gpt2_tokenizer.py", line 62, in __init__
    self.gpt2_encoder = torch.load(vocab_file)
  File "/usr/local/lib/python3.8/dist-packages/torch/serialization.py", line 594, in load
    with _open_file_like(f, 'rb') as opened_file:
  File "/usr/local/lib/python3.8/dist-packages/torch/serialization.py", line 235, in _open_file_like
    return _open_buffer_reader(name_or_buffer)
  File "/usr/local/lib/python3.8/dist-packages/torch/serialization.py", line 220, in __init__
    _check_seekable(buffer)
  File "/usr/local/lib/python3.8/dist-packages/torch/serialization.py", line 311, in _check_seekable
    raise_err_msg(["seek", "tell"], e)
  File "/usr/local/lib/python3.8/dist-packages/torch/serialization.py", line 304, in raise_err_msg
    raise type(e)(msg)
AttributeError: 'NoneType' object has no attribute 'seek'. You can only torch.load from a file that is seekable. Please pre-load the data into a buffer like io.BytesIO and try to load from it instead.
```
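Reading the traceback: `torch.load` accepts either a filesystem path or a seekable file-like object, and the `'NoneType' object has no attribute 'seek'` message means `torch.load(vocab_file)` was called with `vocab_file` set to `None`, i.e. the vocab file was never resolved. As a minimal sketch (assuming the PyTorch version shown in the traceback; this is not DeBERTa code), the same error can be reproduced directly:

```python
import torch

# torch.load() takes a path or a file-like object supporting seek()/tell().
# Passing None falls into the buffer-reader branch and fails exactly like
# the traceback above.
try:
    torch.load(None)
except AttributeError as err:
    print(err)  # 'NoneType' object has no attribute 'seek' ...
```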

I guess the problem is about the data? I checked that the data is not empty, but I am not sure what is wrong with it. I think another possible problem is in vocab_file. Thanks.
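If the vocab file really is the culprit, a simple guard before the `torch.load` call would make the failure easier to read. This is only an illustrative sketch, not the actual `gpt2_tokenizer.py` code; the helper name `load_gpt2_vocab` is hypothetical:

```python
import os
import torch

def load_gpt2_vocab(vocab_file):
    # Fail early with a clear message instead of letting torch.load(None)
    # surface as "'NoneType' object has no attribute 'seek'".
    if not vocab_file or not os.path.isfile(vocab_file):
        raise FileNotFoundError(
            f"GPT-2 vocab file not found: {vocab_file!r}. "
            "Check that the pretrained vocab was downloaded to the cache directory."
        )
    return torch.load(vocab_file)
```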

tjshu commented 2 years ago

Sorry, the problem is solved in this issue: https://github.com/microsoft/DeBERTa/issues/63