justinphan3110cais opened 1 month ago
Hi @justinphan3110cais, I have fixed the error. You can pull main again and try the same command.
@kcz358, thanks, I saw your fix. However, I think a more robust fix would be to check the architectures in the AutoConfig. For example, the current fix will fail if I have a local version of llava-hf-1.6, or a model trained from llava-hf-1.6.
I think both are somewhat hard-coded fixes, but checking the architectures may be more flexible. For now, you just need to make sure your pretrained path contains `1.6` or `1.5` to separate the different versions of llava-hf. I also think the way I parse the results has some problems. I will try to address these issues later, but if you already have your own fix for it, you are welcome to raise a PR.
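A minimal sketch of the architecture-based dispatch discussed above, assuming the version tags and helper name are illustrative (the real values would come from `AutoConfig.from_pretrained(path).architectures` in `transformers`):

```python
# Hedged sketch: dispatch on the architectures listed in a model's config
# instead of matching "1.6" / "1.5" in the pretrained path.
# The function and tag names here are assumptions, not lmms-eval's actual API.

def pick_llava_version(architectures):
    """Map a config's `architectures` list to a llava-hf version tag.

    `architectures` is what you would read from
    AutoConfig.from_pretrained(path).architectures.
    """
    archs = set(architectures or [])
    if "LlavaNextForConditionalGeneration" in archs:
        return "llava_v1_6"  # llava-hf 1.6 family (LlavaNext classes)
    if "LlavaForConditionalGeneration" in archs:
        return "llava_v1_5"  # llava-hf 1.5 family
    raise ValueError(f"Unsupported architectures: {sorted(archs)}")
```

This way, a locally saved copy or a fine-tune of llava-hf-1.6 is still detected correctly even if its path contains no version string.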
Hi,
I'm trying to run lmms-eval on llava-hf/llava-v1.6-x and receive an error:
Tracing back, I think it is because llava-hf/llava-v1.6 uses `LlavaNextForConditionalGeneration` and `LlavaNextProcessor`, which do not have `image_sizes`.
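One way to avoid crashing on models whose classes do not accept `image_sizes` is to filter the kwargs against the callable's signature before the call. This is a hedged sketch, not lmms-eval's actual code; the helper name is hypothetical:

```python
# Hedged sketch: drop kwargs (like image_sizes) that the target callable
# does not accept, so the same call site works for both llava-hf 1.5 and 1.6.
import inspect

def build_generate_kwargs(fn, **candidate_kwargs):
    """Return only the kwargs from candidate_kwargs that `fn` accepts."""
    params = inspect.signature(fn).parameters
    # If fn takes **kwargs, everything is accepted as-is.
    if any(p.kind is inspect.Parameter.VAR_KEYWORD for p in params.values()):
        return dict(candidate_kwargs)
    return {k: v for k, v in candidate_kwargs.items() if k in params}
```

A call would then look like `model.generate(**inputs, **build_generate_kwargs(model.generate, image_sizes=sizes))`, silently omitting `image_sizes` when the model's `generate` does not take it.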
Here is the command that I used: