I downloaded pyllama and converted the weights to Hugging Face format. When I run the following command:
python3 -m fastchat.model.apply_delta --base converted7B/ --target llama_to_vicuna --delta lmsys/vicuna-7b-delta-v1.1
the following error is raised:
OSError: Can't load tokenizer for 'lmsys/vicuna-7b-delta-v1.1'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'lmsys/vicuna-7b-delta-v1.1' is the correct path to a directory containing all relevant files for a LlamaTokenizer tokenizer.
Can someone please suggest a workaround?
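One thing the error message itself hints at is a local directory shadowing the Hub repo id: if a folder named `lmsys/vicuna-7b-delta-v1.1` exists relative to the working directory, the tokenizer loader may try to read from it instead of downloading from the Hub. A minimal check for that condition (purely a diagnostic sketch, using the path from the error message):

```python
import os

# Path taken verbatim from the error message; if this directory exists
# locally but lacks tokenizer files, loading fails with the OSError above.
delta_id = "lmsys/vicuna-7b-delta-v1.1"

shadowed = os.path.isdir(delta_id)
print("local directory shadows repo id:", shadowed)
if shadowed:
    # List what the directory actually contains, to see whether
    # tokenizer files (tokenizer_config.json, tokenizer.model, ...) are missing.
    print(sorted(os.listdir(delta_id)))
```

If the directory does exist, renaming or removing it (or running the command from a different working directory) should let the loader fall back to downloading the real repo from the Hub.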