Ucas-HaoranWei / Vary

[ECCV 2024] Official code implementation of Vary: Scaling Up the Vision Vocabulary of Large Vision Language Models.

Keeps generating duplicate content #20

Open qiufengyuyi opened 8 months ago

qiufengyuyi commented 8 months ago

I deployed the model on an A100 with:

output_ids = model.generate(
    input_ids,
    # Vary takes a pair of image tensors per sample, one per vision branch
    images=[(image_tensor.unsqueeze(0).half().cuda(),
             image_tensor_1.unsqueeze(0).half().cuda())],
    do_sample=True,
    num_beams=1,
    temperature=0.1,             # low-temperature sampling
    streamer=streamer,
    max_new_tokens=2048,
    repetition_penalty=1.05,     # mild penalty; not enough to stop the loops
    stopping_criteria=[stopping_criteria],
)

It always generates duplicate content. I have tried different generation configs, but none of them helped (a sketch of the kind of settings I tried is at the end of this comment). By the way, when I use the following code to load the model:

disable_torch_init()
# model_name = os.path.expanduser(model_name)

tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)

model = varyQwenForCausalLM.from_pretrained(
    model_name, low_cpu_mem_usage=True, device_map='cuda', trust_remote_code=True
)
model.to(device='cuda', dtype=torch.bfloat16)

image_processor = CLIPImageProcessor.from_pretrained(clip_model, torch_dtype=torch.float16)

I get the following warning:

/opt/anaconda3/envs/qiu_chatglm3/lib/python3.10/site-packages/torch/nn/modules/module.py:2025: UserWarning: for vision_model.encoder.layers.23.self_attn.v_proj.bias: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass assign=True to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)

Is there any problem with this warning?
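
For reference, this is the kind of generation config I tried against the repetition; a minimal sketch using standard Hugging Face generate kwargs (the exact values here are illustrative guesses, not settings from the Vary repo):

# Greedy decoding plus n-gram blocking: standard anti-repetition knobs.
output_ids = model.generate(
    input_ids,
    images=[(image_tensor.unsqueeze(0).half().cuda(),
             image_tensor_1.unsqueeze(0).half().cuda())],
    do_sample=False,           # deterministic decoding, rules out sampling noise
    num_beams=1,
    no_repeat_ngram_size=6,    # forbid verbatim 6-gram repeats
    repetition_penalty=1.2,    # stronger than the 1.05 above
    max_new_tokens=2048,
    stopping_criteria=[stopping_criteria],
)

None of these combinations helped.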

yxc0915 commented 8 months ago

I ran into the same situation using the demo. (screenshot: 2023-12-25 185503)

yangbaoquan commented 8 months ago

I have run into the same problem with device_map='auto' on 3 Nvidia RTX 4090s. The output seems to have nothing to do with the input image, so I wonder if there is something wrong with the inference code. By the way, it outputs different results even when I use the same image and prompt as input (though with do_sample=True some run-to-run variation is expected).
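
To rule out the cross-GPU sharding, I would also try pinning the whole model to a single card; a minimal sketch, reusing the loading code from the first post (untested as a fix):

import torch

# Place every module on GPU 0 instead of letting device_map='auto' shard
# the model across the three cards, and set the dtype at load time.
model = varyQwenForCausalLM.from_pretrained(
    model_name,
    trust_remote_code=True,
    torch_dtype=torch.bfloat16,
    device_map={'': 0},
)
model.eval()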

Salomeeeee commented 6 months ago

Encountered the same problem

supergangchao commented 2 months ago

me too

  warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/mnt/sdc/soft/anaconda3/envs/vary/lib/python3.10/site-packages/torch/nn/modules/module.py:2047: UserWarning: for vision_model.encoder.layers.23.layer_norm2.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
  warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
(the same UserWarning repeats for vision_model.encoder.layers.23.layer_norm2.bias, vision_model.post_layernorm.weight, and vision_model.post_layernorm.bias)
Killed
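
The "Killed" at the end looks like the Linux out-of-memory killer terminating the process, not a Python error. A loading variant that may sidestep both the meta-parameter warning and the host-RAM spike; a sketch only, reusing the names from the first post (whether it also fixes the repetition is untested):

import torch

# Load the weights directly in bf16 on the GPU. With no separate .to()
# afterwards, there is no copy from checkpoint tensors into meta-initialized
# parameters, which is what the UserWarning above is complaining about.
model = varyQwenForCausalLM.from_pretrained(
    model_name,
    trust_remote_code=True,
    low_cpu_mem_usage=True,
    torch_dtype=torch.bfloat16,
    device_map='cuda',
)
model.eval()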