Describe the issue
Issue:
I have fine-tuned liuhaotian/llava-v1.5-13b on an OCR task using LoRA. I am now trying to use this model for inference, but when I try to merge the LoRA weights it throws a bizarre error that simultaneously says the Llava configs (LlavaConfig, LlavaMptConfig, LlavaMistralConfig) are installed and that they don't exist. I am using the latest version of the repo (commit c121f04) and the default conda environment, with the only difference being that I installed protobuf because it threw an error if it wasn't installed. I have been able to replicate this across multiple cloud machines. Versions: transformers==4.37.2 and tokenizers==0.15.1.
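(For context, by "merging" I mean folding the LoRA adapter produced by fine-tuning back into the base weights. A minimal PEFT-based sketch of that operation is below; the lora/save paths are placeholders, and the repo's own merge script goes through llava.model.builder and does more than this, so this is illustration only, not the exact command I am running.)

```python
# Simplified sketch of merging a LoRA adapter into the base model with PEFT.
# Assumes the repo's `llava` package is importable; lora_path/save_path are placeholders.
import torch
from transformers import AutoTokenizer
from peft import PeftModel
from llava.model import LlavaLlamaForCausalLM

base_path = "liuhaotian/llava-v1.5-13b"
lora_path = "./checkpoints/llava-v1.5-13b-ocr-lora"      # placeholder
save_path = "./checkpoints/llava-v1.5-13b-ocr-merged"    # placeholder

tokenizer = AutoTokenizer.from_pretrained(base_path, use_fast=False)
model = LlavaLlamaForCausalLM.from_pretrained(
    base_path, torch_dtype=torch.float16, low_cpu_mem_usage=True
)

# Attach the fine-tuned LoRA adapter and fold it into the base weights.
model = PeftModel.from_pretrained(model, lora_path)
model = model.merge_and_unload()

model.save_pretrained(save_path)
tokenizer.save_pretrained(save_path)
```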
Command:
Log:
Notice that, according to the error, LlavaConfig, LlavaMptConfig, and LlavaMistralConfig are installed.
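For what it's worth, the three config classes named in the error can be checked directly in the same environment (assuming the standard llava.model package layout, run from the repo root):

```python
# Check whether the configs named in the error are actually importable here.
from llava.model import LlavaConfig, LlavaMptConfig, LlavaMistralConfig

for cfg in (LlavaConfig, LlavaMptConfig, LlavaMistralConfig):
    print(cfg.__name__, "->", cfg.__module__)
```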