HaozheZhao / MIC

MMICL, a state-of-the-art VLM with in-context learning ability, from PKU
320 stars 15 forks

multiple gpus loading #21

Open hitxujian opened 10 months ago

hitxujian commented 10 months ago

Does this support multi-GPU inference? Why can't the model be loaded with device_map='auto'?

HaozheZhao commented 10 months ago

I'm sorry, but your question is a bit confusing to me :( Could you please provide more details about the problem you're facing or the functionality you need? It would also help if you could share some related code.

hitxujian commented 10 months ago

model = InstructBlipForConditionalGeneration.from_pretrained(model_ckpt, device_map='auto', config=config, **{"torch_dtype": torch.bfloat16})

ValueError: InstructBlipForConditionalGeneration does not support device_map='auto'. To implement support, the model class needs to implement the _no_split_modules attribute.
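One possible workaround (a sketch, not the repo's official fix): since the model class lacks _no_split_modules, device_map='auto' raises the error above, but from_pretrained also accepts an explicit device_map dict mapping module names to device indices. The helper below spreads numbered layers evenly across GPUs; the "layers" prefix is a hypothetical placeholder, so inspect model.named_modules() for the real module names in your checkpoint.

```python
def make_device_map(num_layers: int, num_gpus: int, prefix: str = "layers") -> dict:
    """Spread transformer layers evenly across GPUs.

    Returns a dict like {"layers.0": 0, ..., "layers.7": num_gpus - 1},
    the format accepted by from_pretrained(device_map=...).
    """
    per_gpu = -(-num_layers // num_gpus)  # ceiling division
    return {f"{prefix}.{i}": i // per_gpu for i in range(num_layers)}

# Example: 8 layers spread over 2 GPUs
device_map = make_device_map(8, 2)
# layers 0-3 go to GPU 0, layers 4-7 to GPU 1

# Hypothetical usage (untested; requires the accelerate package installed):
# model = InstructBlipForConditionalGeneration.from_pretrained(
#     model_ckpt,
#     config=config,
#     torch_dtype=torch.bfloat16,
#     device_map=device_map,  # explicit map instead of 'auto'
# )
```

Modules left out of the map are another source of errors, so every top-level submodule (vision tower, Q-Former, language model, etc.) would need an entry in a real map.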

lucasmgomez commented 5 months ago

I also need to use multiple GPUs to load the model. Is that supported?