xiyanghu opened 1 year ago
Another issue: it seems the Google Drive folder does not contain the tokenizer.json file?
In addition to the pre-trained model checkpoint, could you also provide a checkpoint of the trained retriever? Thank you!
Sorry for the delay; I have updated config.json and tokenizer.json. The tokenizer is actually the same as XLM-R Large on Hugging Face.
When I tried to run the training, it raised an error about a missing model_type key in config.json. The training script I used is the one in run-ccp.sh. This is the error message:
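In case it helps: since the checkpoint is said to match XLM-R Large, one possible fix (a guess, not confirmed against this repo's checkpoint) is to add a model_type entry to config.json so that Transformers can resolve the architecture:

```json
{
  "model_type": "xlm-roberta"
}
```

"xlm-roberta" is the model_type string that Hugging Face Transformers uses for XLM-R models; merge this key into the existing config.json rather than replacing the file.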