Open cynthia-nlp opened 7 months ago
The issue you're encountering is caused by the `+=` operator. `+=` is used for concatenation, meaning you're trying to concatenate the string `f"{d}\n"` onto the array variable `prompt`. Try using `prompt.append(f"{d}\n")` instead and see if that solves the issue.
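For reference, here is a minimal sketch of the two approaches the comment contrasts, assuming `demos` stands in for the sampled demonstrations:

```python
demos = ["First demonstration sentence.", "Second demonstration sentence."]

# String approach: "+=" concatenates each piece onto a growing string.
prompt = ""
for d in demos:
    prompt += f"{d}\n"

# List approach: append each piece, then join once at the end.
parts = []
for d in demos:
    parts.append(f"{d}\n")
prompt = "".join(parts)
```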
Thank you for your reply! `prompt` is a string variable, and I verified this with `isinstance()`. After concatenating, `prompt` is still a string variable. When I set `max_seq_length` to 512, I found that generation got stuck when decoding reached about 500 tokens. Later, by coincidence, I set a random seed and the problem was solved, although I still don't know why.
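For context, the type check and the seeding step mentioned above might look roughly like this (a sketch assuming the Hugging Face `transformers` utilities; the prompt string and seed value are placeholders):

```python
from transformers import set_seed

prompt = "Demonstration one.\nDemonstration two.\n"  # placeholder for the real prompt
assert isinstance(prompt, str)  # "+=" on a string still yields a plain string

set_seed(42)  # seeds Python's random, NumPy, and PyTorch RNGs in one call
```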
I encountered some problems when using Llama2-70b-chat to generate some sentences. Specifically, I constructed a prompt template similar to:
The corresponding code is implemented as:
`sentences` is a list of strings from which I randomly sample five sentences as demonstrations. After running, Llama's output either does not answer the question, or generation freezes and produces no response. However, if I modify the code to:

The code then runs successfully. I tried commenting out different parts and found that the code runs successfully when I remove the following:
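A minimal sketch of the pattern described, sampling five demonstrations from `sentences` and building the prompt with `+=`, might look like the following; the model checkpoint, the example data, the question, and the generation arguments are placeholders rather than the author's actual code:

```python
import random

from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "meta-llama/Llama-2-70b-chat-hf"  # assumed checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, device_map="auto")

sentences = [f"Example sentence number {i}." for i in range(1, 11)]  # placeholder pool
question = "Write one more sentence in the same style."              # placeholder task

# Build the prompt by concatenating five randomly sampled demonstrations.
prompt = ""
for d in random.sample(sentences, 5):
    prompt += f"{d}\n"
prompt += question

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=512, do_sample=True)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```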
So what went wrong, and why does string concatenation cause decoding to fail?