InternLM / InternLM-XComposer

InternLM-XComposer-2.5: A Versatile Large Vision Language Model Supporting Long-Contextual Input and Output

The code in https://huggingface.co/internlm/internlm-xcomposer2-7b cannot be run successfully. RuntimeError occurs: "compute_indices_weights_cubic" not implemented for 'Half' #185

Closed zhulinJulia24 closed 4 months ago

zhulinJulia24 commented 4 months ago
(lmdeployv25) [zhulin1@SH-IDC1-10-140-0-187 InternLMLin]$ python
Python 3.10.0 (default, Mar  3 2022, 09:58:08) [GCC 7.5.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
import torch
from PIL import Image
from transformers import AutoTokenizer, AutoModelForCausalLM
ckpt_path = "internlm/internlm-xcomposer2-7b"
tokenizer = AutoTokenizer.from_pretrained(ckpt_path, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(ckpt_path, torch_dtype=torch.float16, trust_remote_code=True).cuda()
You are using a model of type internlmxcomposer2 to instantiate a model of type internlm. This is not supported for all configurations of models and can yield errors.
/home/zhulin1/miniconda3/envs/lmdeployv25/lib/python3.10/site-packages/torch/_utils.py:831: UserWarning: TypedStorage is deprecated. It will be removed in the future and UntypedStorage will be the only storage class. This should only matter to you if you are using storages directly.  To access UntypedStorage directly, use tensor.untyped_storage() instead of tensor.storage()
  return self.fget.__get__(instance, owner)()
Set max length to 4096
Position interpolate from 24x24 to 16x16
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/zhulin1/miniconda3/envs/lmdeployv25/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 561, in from_pretrained
    return model_class.from_pretrained(
  File "/home/zhulin1/miniconda3/envs/lmdeployv25/lib/python3.10/site-packages/transformers/modeling_utils.py", line 3596, in from_pretrained
    model = cls(config, *model_args, **model_kwargs)
  File "/home/zhulin1/.cache/huggingface/modules/transformers_modules/internlm/internlm-xcomposer2-7b/eb081b514df33317480351d78daa2ee54251a111/modeling_internlm_xcomposer2.py", line 67, in __init__
    self.vit = build_vision_tower()
  File "/home/zhulin1/.cache/huggingface/modules/transformers_modules/internlm/internlm-xcomposer2-7b/eb081b514df33317480351d78daa2ee54251a111/build_mlp.py", line 11, in build_vision_tower
    return CLIPVisionTower(vision_tower)
  File "/home/zhulin1/.cache/huggingface/modules/transformers_modules/internlm/internlm-xcomposer2-7b/eb081b514df33317480351d78daa2ee54251a111/build_mlp.py", line 59, in __init__
    self.resize_pos()
  File "/home/zhulin1/.cache/huggingface/modules/transformers_modules/internlm/internlm-xcomposer2-7b/eb081b514df33317480351d78daa2ee54251a111/build_mlp.py", line 88, in resize_pos
    pos_tokens = torch.nn.functional.interpolate(
  File "/home/zhulin1/miniconda3/envs/lmdeployv25/lib/python3.10/site-packages/torch/nn/functional.py", line 4028, in interpolate
    return torch._C._nn.upsample_bicubic2d(input, output_size, align_corners, scale_factors)
RuntimeError: "compute_indices_weights_cubic" not implemented for 'Half'
yuhangzang commented 4 months ago

You may change model = AutoModelForCausalLM.from_pretrained(ckpt_path, torch_dtype=torch.float16, ... to model = AutoModelForCausalLM.from_pretrained(ckpt_path, torch_dtype=torch.float32, ...
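
For reference, a minimal sketch of the suggested change. Casting the model back to half precision after loading is an optional extra step (an assumption, not part of the original suggestion) that keeps GPU memory close to the float16 path, since the failing interpolation only runs during __init__ on the CPU:

import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

ckpt_path = "internlm/internlm-xcomposer2-7b"
tokenizer = AutoTokenizer.from_pretrained(ckpt_path, trust_remote_code=True)
# Load in float32 so resize_pos can interpolate the position embeddings on the CPU
model = AutoModelForCausalLM.from_pretrained(
    ckpt_path, torch_dtype=torch.float32, trust_remote_code=True)
# Optional (assumption): cast back to half precision once initialization is done
model = model.half().cuda().eval()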

zhulinJulia24 commented 4 months ago

You may change model = AutoModelForCausalLM.from_pretrained(ckpt_path, torch_dtype=torch.float16, ... to model = AutoModelForCausalLM.from_pretrained(ckpt_path, torch_dtype=torch.float32, ...

model = AutoModelForCausalLM.from_pretrained(ckpt_path, torch_dtype=torch.float32, trust_remote_code=True).cuda()

@yuhangzang it works.

Should we change the example code in https://huggingface.co/internlm/internlm-xcomposer2-7b ? The same error also occurs with the https://huggingface.co/internlm/internlm-xcomposer2-vl-7b demo code; how can that be resolved?

yuhangzang commented 4 months ago

Thanks for your suggestion. We have modified the README file of xcomposer2-7b.
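
Since the same RuntimeError is reported for internlm-xcomposer2-vl-7b, the same dtype change presumably applies there as well. A minimal sketch, assuming the vl-7b demo loads the model through AutoModelForCausalLM in the same way (the loader class used in the actual demo code may differ):

import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

ckpt_path = "internlm/internlm-xcomposer2-vl-7b"
tokenizer = AutoTokenizer.from_pretrained(ckpt_path, trust_remote_code=True)
# Loading in float32 avoids the half-precision bicubic interpolation during __init__
model = AutoModelForCausalLM.from_pretrained(
    ckpt_path, torch_dtype=torch.float32, trust_remote_code=True).cuda().eval()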