Closed · TingNLP closed this issue 3 years ago
The command runs for me and, according to your logs, the Trainer is loading a local checkpoint named `roberta-base`. Do you have a local folder named `roberta-base`? It looks like it contains a checkpoint different from the actual `roberta-base` model, which causes the error. Could you move that folder and try again?
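To illustrate why a local folder can shadow the Hub model: checkpoint resolution checks for a local directory of that name first, and only falls back to the Hub id if none exists. A simplified sketch of that behavior (this is not the actual Transformers code, just the resolution order it follows):

```python
import os
import tempfile

def resolve_checkpoint(name_or_path: str) -> str:
    """Simplified sketch: an existing local directory with the same name
    wins over the Hub model id, so a stray ./roberta-base folder gets
    loaded instead of the real roberta-base weights."""
    if os.path.isdir(name_or_path):
        return f"local:{name_or_path}"
    return f"hub:{name_or_path}"

cwd = os.getcwd()
with tempfile.TemporaryDirectory() as tmp:
    os.chdir(tmp)
    try:
        print(resolve_checkpoint("roberta-base"))  # hub:roberta-base
        os.mkdir("roberta-base")                   # a local folder now shadows the id
        print(resolve_checkpoint("roberta-base"))  # local:roberta-base
    finally:
        os.chdir(cwd)
```

Renaming or moving the folder (e.g. `mv roberta-base roberta-base-ckpt`) makes the name fall through to the Hub id again.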
@sgugger
Yes, I created a local folder named `roberta-base`, but its contents were downloaded from Hugging Face (https://huggingface.co/roberta-base/tree/main).

[screenshot of the `language-modeling` folder]

[screenshot of the `roberta-base` folder]

So I am confused...
I think it's linked to the bug that #11492 is fixing. It should be merged today, and then you can try it on a source install!
Environment info
`transformers` version: 4.6.0.dev0
Who can help
@sgugger
Information
Model I am using: roberta
The problem arises when using:
The task I am working on is: the wikitext dataset (https://www.salesforce.com/products/einstein/ai-research/the-wikitext-dependency-language-modeling-dataset/)
To reproduce
Steps to reproduce the behavior:
I followed the example at https://github.com/huggingface/transformers/tree/master/examples/pytorch/language-modeling.
When I run the script, the error occurs.
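For reference, the masked-language-modeling command from the linked examples README looks like the following; the issue does not preserve the reporter's exact command, so treat this as an illustration only:

```bash
python run_mlm.py \
    --model_name_or_path roberta-base \
    --dataset_name wikitext \
    --dataset_config_name wikitext-2-raw-v1 \
    --do_train \
    --do_eval \
    --output_dir /tmp/test-mlm
```

Note that if a local `roberta-base` directory sits in the working directory, `--model_name_or_path roberta-base` will pick it up instead of the Hub checkpoint.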
Expected behavior
The expected behavior is that I get a newly pretrained language model based on my dataset.