whatisslove11 opened this issue 1 year ago
It could be that you are mixing the model between CPU and GPU, and bitsandbytes doesn't like that. Try just with CUDA:
```python
tokenizer, model, image_processor, context_len = load_pretrained_model(
    MODEL, None, model_name, True, False, device="cuda", device_map={"": 0}
)
```
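For reference, a minimal self-contained sketch of the suggested all-CUDA load, assuming the standard `llava.model.builder` API from the LLaVA repo; the model path used here is only a placeholder, substitute your own checkpoint:

```python
# Minimal sketch: load LLaVA entirely on GPU 0, no CPU offload.
# Assumes the haotian-liu/LLaVA package layout; MODEL is a placeholder path.
from llava.model.builder import load_pretrained_model
from llava.mm_utils import get_model_name_from_path

MODEL = "liuhaotian/llava-v1.5-7b"  # placeholder, replace with your checkpoint
model_name = get_model_name_from_path(MODEL)

tokenizer, model, image_processor, context_len = load_pretrained_model(
    MODEL,
    None,                # model_base: None when loading a full checkpoint
    model_name,
    True,                # load_8bit
    False,               # load_4bit
    device="cuda",
    device_map={"": 0},  # pin all weights to GPU 0 so nothing lands on CPU
)
```

Keeping every weight on a single device avoids the CPU/GPU split that 8-bit quantization typically cannot handle.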
Hello! I want to test LLaVA for auto-distillation, but I got this error:
Minimal code to reproduce the error:
Full error: