Closed minjung98 closed 5 months ago
You need to manually download Vicuna 7B v1.1 into that folder:

```shell
# from the root of this project
git clone https://huggingface.co/lmsys/vicuna-7b-v1.1 ./llm/vicuna-7b
```

This requires Git LFS.
Note: LAVIS can also fetch lmsys/vicuna-7b-v1.1 directly from the HF Hub; it would probably be worthwhile to implement that here.
Thank you for taking the time. I have manually downloaded the model as you instructed, but a new error now occurs. It seems to be related to the tokenizer. Do you know of any solution? Thank you once again for your prompt response. Below is the error I encountered:
```
RuntimeError: Internal: could not parse ModelProto from llm/vicuna-7b/tokenizer.model
```
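For context, this particular error often means `tokenizer.model` is still an un-pulled Git LFS pointer stub (a ~130-byte text file) rather than the real SentencePiece binary. A quick way to check, using a hypothetical helper that is not part of MA-LMM:

```python
from pathlib import Path

# Every git-lfs pointer file starts with this line.
LFS_MAGIC = b"version https://git-lfs.github.com/spec/v1"

def is_lfs_pointer(path):
    """Return True if `path` holds a git-lfs pointer stub rather than real data."""
    try:
        head = Path(path).read_bytes()[: len(LFS_MAGIC)]
    except OSError:
        return False
    return head == LFS_MAGIC
```

If this returns True for `llm/vicuna-7b/tokenizer.model`, re-run `git lfs pull` inside the cloned folder to replace the stub with the actual file.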
Hi, did you follow the instructions at https://github.com/salesforce/LAVIS/tree/main/projects/instructblip to download vicuna-7b v1.1 and apply the delta weights to the original LLaMA weights? Alternatively, according to this issue, you can download it directly from this link.
Thank you for your help. However, I still encounter the following error every time I run the command:
```shell
python3 -m fastchat.model.apply_delta \
    --base-model-path /home/minjung/MA-LMM/llm/vicuna-7b \
    --target-model-path /home/minjung/MA-LMM/llm_output/vicuna-7b \
    --delta-path lmsys/vicuna-7b-delta-v1.1
```
The error is:
```
OSError: You seem to have cloned a repository without having git-lfs installed. Please install git-lfs and run 'git lfs install' followed by 'git lfs pull' in the folder you cloned.
```
I have followed the instructions and run `git lfs install` and `git lfs pull`, but I keep getting the same error.
I also have another question: should I clone the repository https://github.com/lm-sys/FastChat.git inside the MA-LMM folder or outside it? Thank you once again for your kind response.
When I run the following cell in `demo.ipynb`:

```python
model, vis_processors, _ = load_model_and_preprocess(
    name="blip2_vicuna_instruct_malmm",
    model_type="vicuna7b",
    is_eval=True,
    device=device,
)
```
I encounter this error:
```
llm/vicuna-7b is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models'
If this is a private repository, make sure to pass a token having permission to this repo either by logging in
with `huggingface-cli login` or by passing `token=<your_token>`
```
How can I resolve this issue?
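This error means `from_pretrained` could not find a usable model at the relative path `llm/vicuna-7b` (note that relative paths resolve against the notebook's working directory). One way to narrow it down is a small pre-flight check before calling `load_model_and_preprocess`. This is a hypothetical helper, and the required file names below are an assumption based on the usual HF LLaMA/Vicuna layout:

```python
from pathlib import Path

# Assumed minimal file set for a local LLaMA/Vicuna checkout.
REQUIRED_FILES = ("config.json", "tokenizer.model")

def check_local_model(path):
    """Return a list of problems; an empty list means the folder looks usable."""
    root = Path(path)
    if not root.is_dir():
        return [f"not a directory: {root}"]
    return [f"missing file: {name}"
            for name in REQUIRED_FILES
            if not (root / name).is_file()]
```

Running `check_local_model("llm/vicuna-7b")` from the same directory as the notebook quickly shows whether the path itself is wrong or whether individual files (e.g. LFS-tracked weights) are missing.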