Closed: hlh2023214 closed this issue 8 months ago
Hi @hlh2023214,
Thank you for your interest in our work. The issue is most likely related to an unsupported transformers version. Please follow the steps below and let me know if they resolve the issue.
```shell
# Stop the demo, then reinstall the pinned dependencies
pip install -r requirements.txt
# Run the demo again and verify whether the issue is still there.
# If the issue persists, try downloading the model checkpoints again from HuggingFace.
```
If that does not solve the issue, please share the output of the `conda list` command so we can investigate further. Thank you.
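For reference, a quick way to confirm that the installed transformers version actually matches the pin in requirements.txt (the exact pinned version depends on the repo revision you cloned):

```shell
# Print the transformers version installed in the active environment
python -c "import transformers; print(transformers.__version__)"
# Compare it against the version pinned in the repo's requirements.txt
grep -i transformers requirements.txt
```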
My transformers version was consistent with yours, but I later found that the Llama model I was using was incorrect. I was using Llama-2-7b-hf, which is not the same as your version, hence the garbled output. Thank you, my issue has been resolved.
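For anyone who hits the same garbled-output symptom, one quick sanity check is to inspect the checkpoint's config and confirm it is the model variant the repo expects (the local path below is a placeholder; substitute your own checkpoint directory):

```shell
# Print the config stored in the checkpoint directory, which includes the
# architecture and the name of the model it was originally saved from
python -c "from transformers import AutoConfig; print(AutoConfig.from_pretrained('./checkpoints/llama'))"
```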
Hi @mmaaz60, I have followed the steps in offline_demo.md meticulously for local deployment, but the output responses are always garbled. Could you please help me solve this issue?