OpenBMB / MiniCPM-V

MiniCPM-Llama3-V 2.5: A GPT-4V Level Multimodal LLM on Your Phone

What does --ckpt mean in the eval script? #251

Closed. hill2hill closed this issue 3 weeks ago.

hill2hill commented 3 weeks ago
# model name and model path
--model_name
--model_path
# load model from ckpt
--ckpt

When should I set this parameter? Its default value is "", but the code only checks whether it is None.
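
For context, a minimal sketch of why an empty-string default slips past an `is None` guard. The argument names follow the flags quoted above, but the parsing and loading logic here is illustrative, not the actual eval code:

import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--model_name", type=str, default="")
parser.add_argument("--model_path", type=str, default="")
# note: the default is "", not None
parser.add_argument("--ckpt", type=str, default="")
args = parser.parse_args()

# a guard like this passes even when --ckpt was never supplied,
# because "" is not None, so the script still tries to load a checkpoint from ""
if args.ckpt is not None:
    print(f"loading checkpoint weights from {args.ckpt!r}")

Setting the value to None (or guarding with `if args.ckpt:` instead) avoids the spurious load attempt, which matches the workaround in the next comment.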

hill2hill commented 3 weeks ago

In my case, I just set --ckpt to None, and it seems to work fine: I can reproduce the TextVQA result with run_inference.sh. One thing that might be helpful if somebody hits an error like TypeError: MiniCPMV.chat() missing 1 required positional argument: 'image': you can pass the image argument explicitly, like:

res = self.model.chat(
    image=image,
    msgs=msgs,
    context=None,
    tokenizer=self.tokenizer,
    **default_kwargs
)

This applies if you are using --generate_method interleave in run_inference.sh.
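
For reference, a standalone sketch of the same chat() call outside the evaluator, assuming the usual Hugging Face loading pattern shown in the model card; the image path, sampling settings, and use of a GPU are just placeholders:

import torch
from PIL import Image
from transformers import AutoModel, AutoTokenizer

model = AutoModel.from_pretrained('openbmb/MiniCPM-Llama3-V-2_5',
                                  trust_remote_code=True, torch_dtype=torch.float16).eval().cuda()
tokenizer = AutoTokenizer.from_pretrained('openbmb/MiniCPM-Llama3-V-2_5', trust_remote_code=True)

image = Image.open('example.jpg').convert('RGB')  # placeholder image path
msgs = [{'role': 'user', 'content': 'What is in the image?'}]

# image is a required argument of chat(), which is why the evaluator call
# fails with the TypeError above when it is omitted
res = model.chat(image=image, msgs=msgs, tokenizer=tokenizer, sampling=True)
print(res)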

Cuiunbo commented 3 weeks ago

thanks~