Open volcverse opened 3 months ago
same
I followed example_code/example_chat.py to run the newest InternLM-XComposer-2.5 model on 4 A800 GPUs, but I still hit the OOM problem.
same question
Please try to install transformers==4.33.1 with the following command and try again:
pip install transformers==4.33.1
Hello, thanks for the great work!
I followed example_code/example_chat.py to run the newest InternLM-XComposer-2.5 model on 4 NVIDIA 4090 GPUs, but I still hit the OOM problem. It seems that although the weights are split across the GPUs successfully, the first GPU always runs out of memory when model.chat is called. Any response will be greatly appreciated!
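A common mitigation for GPU 0 running out of memory while the other shards have headroom is to cap per-GPU weight allocation with `max_memory` when loading with `device_map='auto'`, reserving extra room on GPU 0 for the activations generated during `model.chat`. This is only a sketch under assumptions, not a confirmed fix: the memory figures below are illustrative, and the `from_pretrained` call is left as a comment since loading the actual model is not shown in the thread.

```python
def build_max_memory(num_gpus, first_gpu_gib, other_gpu_gib):
    """Build a max_memory map that leaves extra headroom on GPU 0,
    where activations tend to accumulate during generation."""
    return {
        i: f"{first_gpu_gib if i == 0 else other_gpu_gib}GiB"
        for i in range(num_gpus)
    }

# Illustrative figures for 4 x 24 GB cards: keep most weights off GPU 0.
max_memory = build_max_memory(num_gpus=4, first_gpu_gib=8, other_gpu_gib=20)

# Hypothetical usage (model id and dtype are assumptions, not verified here):
# from transformers import AutoModel
# model = AutoModel.from_pretrained(
#     'internlm/internlm-xcomposer2d5-7b',
#     device_map='auto',
#     max_memory=max_memory,
#     trust_remote_code=True,
# ).eval()
```

If GPU 0 still overflows, lowering `first_gpu_gib` further pushes more weight shards onto the remaining GPUs at the cost of extra cross-device traffic.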
I found the model cannot take multiple images as input, nor a list of images, so the fix is to pass a single image path:
image = './examples/dubai.png'
query = '<ImageHere>Please describe this image'
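Building on the single-image workaround above, one plausible source of errors is a mismatch between the number of `<ImageHere>` placeholders in the query and the number of images supplied. The helper below is my own sketch (not from the repo) of a guard you could run before calling `model.chat`:

```python
def check_image_slots(query, images):
    """Verify the query has exactly one <ImageHere> slot per supplied image.
    A mismatch is a likely cause of failures when passing multiple images."""
    slots = query.count('<ImageHere>')
    if slots != len(images):
        raise ValueError(
            f"query has {slots} <ImageHere> slot(s) "
            f"but {len(images)} image(s) were given"
        )
    return True

# Matches the working single-image example above:
check_image_slots('<ImageHere>Please describe this image',
                  ['./examples/dubai.png'])
```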
I still hit the same problem with transformers 4.33.1. I'm running the video understanding example from the Hugging Face page. Any response will be greatly appreciated.