PharMolix / OpenBioMed

MIT License

Questions about running biomedgpt inference code #29

Closed xt-mao closed 1 year ago

xt-mao commented 1 year ago

Hi, I tried to run "biomedgpt_inference.ipynb", but the following call fails:

print("Assistant: ", chat.answer()[0])

It produces this traceback:

Traceback (most recent call last) /tmp/ipykernel_118030/3402149863.py:1 in

[Errno 2] No such file or directory: '/tmp/ipykernel_118030/3402149863.py'

/ssddata/xxx/AImodels/OpenBioMed/OpenBioMed/open_biomed/utils/chat_utils.py:106 in answer

103 │ │ │ embs = embs[:, begin_idx]
104 │ │ │ logger.warn("The number of tokens in current conversation exceeds the max le
105 │ │
❱ 106 │ │ output = self.model.llm.generate(
107 │ │ │ inputs_embeds=embs,
108 │ │ │ max_length=max_new_tokens,
109 │ │ │ num_beams=num_beams,

/sasdata/xxx/software/miniconda3/envs/OpenBioMed/lib/python3.9/site-packages/torch/utils/_contextlib.py:115 in decorate_context

112 │ @functools.wraps(func)
113 │ def decorate_context(*args, **kwargs):
114 │ │ with ctx_factory():
❱ 115 │ │ │ return func(*args, **kwargs)
116 │
117 │ return decorate_context
118

/sasdata/xxx/software/miniconda3/envs/OpenBioMed/lib/python3.9/site-packages/transformers/generation/utils.py:1437 in generate

1434 │ │ │ │ )
1435 │ │ │
1436 │ │ │ # 11. run greedy search
❱ 1437 │ │ │ return self.greedy_search(
1438 │ │ │ │ input_ids,
1439 │ │ │ │ logits_processor=logits_processor,
1440 │ │ │ │ stopping_criteria=stopping_criteria,

/sasdata/xxx/software/miniconda3/envs/OpenBioMed/lib/python3.9/site-packages/transformers/generation/utils.py:2248 in greedy_search

2245 │ │ │ model_inputs = self.prepare_inputs_for_generation(input_ids, model_kwargs)
2246 │ │ │
2247 │ │ │ # forward pass to get next token
❱ 2248 │ │ │ outputs = self(
2249 │ │ │ │ **model_inputs,
2250 │ │ │ │ return_dict=True,
2251 │ │ │ │ output_attentions=output_attentions,

/sasdata/xxx/software/miniconda3/envs/OpenBioMed/lib/python3.9/site-packages/torch/nn/modules/module.py:1501 in _call_impl

1498 │ │ if not (self._backward_hooks or self._backward_pre_hooks or self._forward_hooks
1499 │ │ │ │ or _global_backward_pre_hooks or _global_backward_hooks
1500 │ │ │ │ or _global_forward_hooks or _global_forward_pre_hooks):
❱ 1501 │ │ │ return forward_call(*args, **kwargs)
1502 │ │ # Do not call functions when jit is used
1503 │ │ full_backward_hooks, non_full_backward_hooks = [], []
1504 │ │ backward_pre_hooks = []

/ssddata/xxx/AImodels/OpenBioMed/OpenBioMed/open_biomed/models/multimodal/biomedgpt/modeling_llama.py:676 in forward

673 │ │ return_dict = return_dict if return_dict is not None else self.config.use_return
674 │ │
675 │ │ # decoder outputs consists of (dec_features, layer_state, dec_hidden, dec_attn)
❱ 676 │ │ outputs = self.model(
677 │ │ │ input_ids=input_ids,
678 │ │ │ attention_mask=attention_mask,
679 │ │ │ position_ids=position_ids,

/sasdata/xxx/software/miniconda3/envs/OpenBioMed/lib/python3.9/site-packages/torch/nn/modules/module.py:1501 in _call_impl

1498 │ │ if not (self._backward_hooks or self._backward_pre_hooks or self._forward_hooks
1499 │ │ │ │ or _global_backward_pre_hooks or _global_backward_hooks
1500 │ │ │ │ or _global_forward_hooks or _global_forward_pre_hooks):
❱ 1501 │ │ │ return forward_call(*args, **kwargs)
1502 │ │ # Do not call functions when jit is used
1503 │ │ full_backward_hooks, non_full_backward_hooks = [], []
1504 │ │ backward_pre_hooks = []

/ssddata/xxx/AImodels/OpenBioMed/OpenBioMed/open_biomed/models/multimodal/biomedgpt/modeling_llama.py:517 in forward

514 │ │ │ )
515 │ │ │ position_ids = position_ids.unsqueeze(0).view(-1, seq_length)
516 │ │ else:
❱ 517 │ │ │ position_ids = position_ids.view(-1, seq_length).long()
518 │ │
519 │ │ # embed positions
520 │ │ if attention_mask is None:
RuntimeError: shape '[-1, 94]' is invalid for input of size 95
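For context, the error itself is a plain tensor-reshape failure: `view(-1, seq_length)` requires the element count to be divisible by `seq_length`, and here a 95-element `position_ids` tensor is being reshaped against an expected length of 94 (an off-by-one that version mismatches in `generate` can introduce). A minimal sketch assuming only PyTorch; the numbers are taken from the traceback, not from the actual model state:

```python
import torch

# 95 position ids, while the model expects rows of length 94 at this step
position_ids = torch.arange(95)
seq_length = 94

try:
    position_ids.view(-1, seq_length)
except RuntimeError as e:
    # raises a shape mismatch error mentioning sizes 94 and 95
    print(e)
```

This is why the message reads `shape '[-1, 94]' is invalid for input of size 95`: 95 is not a multiple of 94, so no row count satisfies the reshape.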

Could you please help me debug it? Thanks a lot.

Best,

icycookies commented 1 year ago

I guess the issue stems from a mismatched version of the transformers package; 4.31.0 works in my case.

xt-mao commented 1 year ago

It worked successfully. Thanks~

It would be better if you pinned the important package versions in "requirements.txt".
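For reference, a hypothetical pin capturing the version reported to work in this thread would look like this in requirements.txt (other dependencies omitted):

```
transformers==4.31.0
```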

Best : )