InternLM / InternLM-XComposer

InternLM-XComposer-2.5: A Versatile Large Vision Language Model Supporting Long-Contextual Input and Output

Batch Inference with XComposer2 #171

Closed boofarboofar closed 4 months ago

boofarboofar commented 4 months ago

Hello!

Are there any examples of asking multiple questions about the same image in a batch? It's easy enough to do this serially, but slow. I see there is ShareCaptioner, but I'm looking for this for XComposer2.

bingwork commented 4 months ago

I am referring to https://github.com/InternLM/InternLM-XComposer/blob/main/evaluation/mmbench/utils.py#L110 to rewrite it. There are still some problems at present. It would be better if the official could provide a script. @myownskyW7

panzhang0212 commented 4 months ago

We now support batch inference with XComposer2; please refer to https://github.com/InternLM/InternLM-XComposer/blob/main/examples/batch_chat.py

Feel free to reopen this issue if you have any problems.
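For readers adapting the linked `batch_chat.py`: the core of batching multiple questions into one forward pass is left-padding the tokenized prompts to a common length, since decoder-only models generate from the right. The sketch below shows only that padding step with toy token ids; the pad id and prompts are placeholders, not XComposer2's actual tokenizer or API.

```python
# Sketch of the left-padding step used in batch generation.
# Token ids and pad_id below are illustrative placeholders; the real
# examples/batch_chat.py also handles images and the model's own tokenizer.

def left_pad(batch_token_ids, pad_id):
    """Pad each sequence on the left so all rows share one length.

    Left padding keeps every prompt's last token aligned at the same
    position, which is what decoder-only generation expects.
    Returns (padded ids, attention masks).
    """
    max_len = max(len(ids) for ids in batch_token_ids)
    padded, masks = [], []
    for ids in batch_token_ids:
        pad = [pad_id] * (max_len - len(ids))
        padded.append(pad + ids)          # pad tokens on the left
        masks.append([0] * len(pad) + [1] * len(ids))  # 0 = ignore pad
    return padded, masks

# Three questions about the same image, already tokenized (toy ids):
prompts = [[5, 6], [7, 8, 9, 10], [11]]
ids, mask = left_pad(prompts, pad_id=0)
# every row now has length 4, with real tokens flush right
```

With the padded ids and attention masks stacked into tensors, all questions can be answered in a single `generate` call instead of a serial loop.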