Open gehong-coder opened 5 months ago
model, tokenizer = get_model_tokenizer(model_type, torch.float16, load_in_4bit=True, model_kwargs={'device_map': 'auto'}, use_unsloth=True)

NotImplementedError: Unsloth: .cache/modelscope/hub/ZhipuAI/cogvlm2-llama3-chinese-chat-19B not supported yet!
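As a workaround sketch (assuming ms-swift's get_model_tokenizer as used in the call above), the same load can go through the standard loader by simply dropping use_unsloth=True, since Unsloth rejects this model as unsupported:

```python
# Minimal sketch, not a confirmed fix: mirror the failing call but skip Unsloth.
# model_type is the same value used in the original call above.
import torch
from swift.llm import get_model_tokenizer

model, tokenizer = get_model_tokenizer(
    model_type,
    torch.float16,
    load_in_4bit=True,
    model_kwargs={'device_map': 'auto'},
)
```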
Wait, is this a vision model?