JokerZhY closed this issue 1 month ago
Hey @JokerZhY, I'm not sure how that relates to transformers, do you have a code sample I could run to see where the issue lies?
Hey @LysandreJik, I encountered an error while using text-generation-webui .3, specifically when loading the Llama 3.1 model. I had just updated transformers to version 4.43.3, so I now suspect a conflict between the two libraries. When I print input_ids.device, it correctly shows cuda:0.
@oobabooga, please let us know if we can help with anything here
This may be helpful, @oobabooga. I successfully loaded the model "hugging-quants/Meta-Llama-3.1-70B-Instruct-AWQ-INT4" with AutoAWQ in text-generation-webui (not the latest version of the webui), but it does not generate any output and raises the same error.
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Please note that issues that do not follow the contributing guidelines are likely to be ignored.
System Info
jetson-agix-orin
Who can help?
No response
Information

Tasks
An officially supported task in the examples folder (such as GLUE/SQuAD, ...)

Reproduction
Traceback (most recent call last):
  File "/opt/text-generation-webui/modules/callbacks.py", line 61, in gentask
    ret = self.mfunc(callback=_callback, *args, **self.kwargs)
  File "/opt/text-generation-webui/modules/text_generation.py", line 382, in generate_with_callback
    shared.model.generate(**kwargs)
  File "/usr/local/lib/python3.10/dist-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/transformers/generation/utils.py", line 1975, in generate
    self._get_logits_warper(generation_config, device=input_ids.device)
TypeError: get_logits_warper_patch() got an unexpected keyword argument 'device'
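The traceback suggests text-generation-webui monkeypatches transformers' `_get_logits_warper` with a function named `get_logits_warper_patch`, whose signature predates the `device` keyword argument that newer transformers releases pass in. The following is a minimal sketch of that failure mode and a forward-compatible fix, with a stand-in `Generator` class rather than the actual webui or transformers code:

```python
class Generator:
    # Stand-in for the library method being patched; newer versions
    # of the library call it with an extra `device` keyword argument.
    def _get_logits_warper(self, generation_config, device=None):
        return f"warpers({generation_config}, device={device})"

_original = Generator._get_logits_warper

# Old-style patch, written before the `device` kwarg existed:
def get_logits_warper_patch(self, generation_config):
    return _original(self, generation_config)

# Forward-compatible patch: accept unknown kwargs and pass them through,
# so new keyword arguments added by the library do not break the patch.
def get_logits_warper_patch_fixed(self, generation_config, **kwargs):
    return _original(self, generation_config, **kwargs)

gen = Generator()

# The old patch raises TypeError when called the new way:
Generator._get_logits_warper = get_logits_warper_patch
try:
    gen._get_logits_warper("cfg", device="cuda:0")
except TypeError as exc:
    print(exc)  # ... got an unexpected keyword argument 'device'

# The fixed patch works with both old and new call sites:
Generator._get_logits_warper = get_logits_warper_patch_fixed
print(gen._get_logits_warper("cfg", device="cuda:0"))
```

In practice the fix belongs on the webui side (updating its patch to match the new signature) or in pinning transformers to a version the patch was written against.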
Expected behavior
The model should generate a response to my prompt instead of raising this error.