VincentVanNF opened this issue 1 week ago
Sure, I will provide it in the next few days.
Any update?
@VincentVanNF Sorry for my delay. Could you just change the "--model_name" parameter in the bash script to your local directory path to enable running locally? I tested it on my side and it works.
In our code, the processor loading can also trigger a download from HF, which I guess might be the reason:

```python
processor = AutoProcessor.from_pretrained(
    model_args.model_name,
    trust_remote_code=True,
    num_crops=model_args.num_crops,
)
```
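For reference, a minimal sketch of that same call pointed at a local checkpoint directory (the path and num_crops value below are placeholders, and local_files_only is not in our script, it just makes any accidental Hub access fail loudly):

```python
from transformers import AutoProcessor

# Placeholder path: the directory where the checkpoint (config.json, weights,
# tokenizer/processor files) was downloaded.
local_model_dir = "/path/to/VLM2Vec-Full"

processor = AutoProcessor.from_pretrained(
    local_model_dir,
    trust_remote_code=True,
    num_crops=4,             # placeholder: use the value your run was trained with
    local_files_only=True,   # optional: raise instead of contacting the Hub
)
```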
Please let me know whether it works!
Simply modifying this is not sufficient, because the corresponding modeling_phi3_v.py and configuration_phi3_v.py files are missing in MMEB.fullmodel.bs2048. I ran the model-loading code in an environment with internet access and found these two files in the ~/.cache directory. They need to be placed into the checkpoint directory, and the auto_map value in config.json needs to be modified to point to the local file paths instead of the remote repository.
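To make the auto_map edit concrete, it looks roughly like the diff below. The class names and the upstream repo prefix are taken from the public microsoft/Phi-3-vision-128k-instruct checkpoint, so the exact strings in this checkpoint's config.json may differ slightly:

```diff
 "auto_map": {
-  "AutoConfig": "microsoft/Phi-3-vision-128k-instruct--configuration_phi3_v.Phi3VConfig",
-  "AutoModelForCausalLM": "microsoft/Phi-3-vision-128k-instruct--modeling_phi3_v.Phi3VForCausalLM"
+  "AutoConfig": "configuration_phi3_v.Phi3VConfig",
+  "AutoModelForCausalLM": "modeling_phi3_v.Phi3VForCausalLM"
 }
```

After copying the two .py files next to config.json, the local module names on the right are enough for transformers to resolve the remote code without touching the Hub.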
Thank you for letting me know! I hadn’t realized these files were missing on HF. I’ll upload them now.
Hello, due to network restrictions on the development machine, it cannot access huggingface.co to download and load the model. I have downloaded the VLM2Vec-Full model from huggingface.co and saved it on the development machine, and I would like to load the model locally.
Here is my code:
However, this approach still seems to attempt to download the model from Hugging Face. Could you please provide a demo for loading the model locally? Thank you very much!
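For anyone hitting the same problem, the kind of fully local loading being asked for would look roughly like the sketch below. The checkpoint path is a placeholder, and the offline environment variables are my own suggestion rather than something from the repo's scripts; VLM2Vec's own wrapper builds the model its own way, this only illustrates the offline flags:

```python
import os

# Assumption: the full checkpoint (weights, config.json, processor files, and the
# remote-code modeling_phi3_v.py / configuration_phi3_v.py) is already on disk.
# These variables stop transformers/huggingface_hub from contacting the Hub at all.
os.environ["HF_HUB_OFFLINE"] = "1"
os.environ["TRANSFORMERS_OFFLINE"] = "1"

from transformers import AutoModelForCausalLM, AutoProcessor

local_model_dir = "/path/to/VLM2Vec-Full"  # placeholder: local checkpoint directory

processor = AutoProcessor.from_pretrained(
    local_model_dir, trust_remote_code=True, local_files_only=True
)
model = AutoModelForCausalLM.from_pretrained(
    local_model_dir, trust_remote_code=True, local_files_only=True
)
```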