Open IssamLaradji opened 1 year ago
The BPE tokenizer is a built-in API in transformers. I think there might be some conflicts between your module versions. You can check your environment against our requirements.
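One quick way to spot the version conflicts mentioned above is to compare pinned requirements against what is actually installed. The sketch below is not from the repository; it is a generic helper (the function name `check_requirements` and the example pins are hypothetical) that uses the standard-library `importlib.metadata` to report mismatches.

```python
from importlib import metadata

def check_requirements(lines):
    """Compare pinned requirements (pkg==ver) against installed versions.

    Returns {name: (wanted, installed, ok)}; installed is None if the
    package is not found in the current environment.
    """
    report = {}
    for line in lines:
        line = line.strip()
        # Skip blanks, comments, and unpinned specifiers.
        if not line or line.startswith("#") or "==" not in line:
            continue
        name, wanted = line.split("==", 1)
        try:
            installed = metadata.version(name)
        except metadata.PackageNotFoundError:
            installed = None
        report[name] = (wanted, installed, installed == wanted)
    return report

if __name__ == "__main__":
    # Example pins for illustration only -- substitute the repo's
    # actual requirements.txt contents.
    pins = ["transformers==4.2.1", "# a comment", ""]
    for name, (wanted, installed, ok) in check_requirements(pins).items():
        print(f"{name}: want {wanted}, have {installed}, "
              f"{'OK' if ok else 'MISMATCH'}")
```

In practice you would feed it `open("requirements.txt")` and reinstall any package flagged as a mismatch.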
Hi again,
I ran this command in Section 2.2 and I get the error in the picture below. Could you please advise?
```
python preprocess.py \
    --split dev \
    --file_name exeds_dev.json \
    --do_fairseq_tokenization \
    --do_gptneo \
    --token_type token \
    --context_range 3 \
    --max_code_cell_tokens 200 \
    --max_md_cell_tokens 200 \
    --max_ctx_cell_tokens 900
```
Hi, did you solve this problem? I'm hitting the same error, and I have already checked the requirements.