Open jjz1011 opened 4 months ago
Thanks for your attention! There seem to be two issues with the code. First, the tokenizer call is missing parameters, which causes a warning. I've fixed this by passing the truncation arguments explicitly:
```python
tokenized_data = preproc_tokenizer(long_train_real)  # old
tokenized_data = preproc_tokenizer(long_train_real, truncation=True, max_length=preproc_tokenizer.model_max_length)  # now
```
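To see why this matters: with `truncation=True`, any sequence longer than `model_max_length` (512 here) is cut down before it reaches the model, which is exactly what the "749 > 512" warning below is about. A minimal illustrative sketch of that behavior, using a hypothetical `truncate_ids` helper rather than the real tokenizer:

```python
# Sketch of what truncation=True does: token id sequences longer than
# the model limit are cut to at most max_length, so the model never
# sees out-of-range positions. `truncate_ids` is a hypothetical helper
# for illustration, not part of transformers.
def truncate_ids(token_ids, max_length=512):
    """Keep at most the first max_length token ids."""
    return token_ids[:max_length]

ids = list(range(749))  # a 749-token document, as in the warning
print(len(truncate_ids(ids)))  # 512
```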
The second issue,

```
OSError: No such device (os error 19)
```

can occur when the pre-trained model has not been downloaded, so transformers fails to load the checkpoint. Please follow the steps in readme.md to download the pre-trained model and place it in the correct folder.
```
Token indices sequence length is longer than the specified maximum sequence length for this model (749 > 512). Running this sequence through the model will result in indexing errors
Traceback (most recent call last):
  File "run_meta_mmd_trans.py", line 925, in <module>
    base_model, base_tokenizer = load_base_model_and_tokenizer(args.base_model_name)
  File "run_meta_mmd_trans.py", line 531, in load_base_model_and_tokenizer
    base_model = transformers.AutoModelForCausalLM.from_pretrained(model_path_dit[name])
  File "/root/miniconda3/envs/detectGPT/lib/python3.7/site-packages/transformers/models/auto/auto_factory.py", line 485, in from_pretrained
    pretrained_model_name_or_path, *model_args, config=config, **hub_kwargs, **kwargs
  File "/root/miniconda3/envs/detectGPT/lib/python3.7/site-packages/transformers/modeling_utils.py", line 2604, in from_pretrained
    state_dict = load_state_dict(resolved_archive_file)
  File "/root/miniconda3/envs/detectGPT/lib/python3.7/site-packages/transformers/modeling_utils.py", line 450, in load_state_dict
    with safe_open(checkpoint_file, framework="pt") as f:
OSError: No such device (os error 19)
```