Open liugaohong opened 1 year ago
How can I implement a stopping criterion for Llama? Specifically, I'd like to pass stop sequences the way the ChatGPT/OpenAI API does, so generation halts after the model outputs 'Observation'. For example, with the OpenAI API:
```python
response = openai.Completion.create(
    model="text-davinci-002",
    prompt=prompt,
    temperature=0,
    max_tokens=100,
    top_p=1,
    frequency_penalty=0.0,
    presence_penalty=0.0,
    stop=stop,
)
```
@amitsangani - do you have an example from your NB or somewhere else to share here?
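In the meantime, here is a minimal sketch of the stop-sequence logic itself. The helper names (`should_stop`, `truncate_at_stop`) are my own, not part of any library. With Hugging Face transformers you would typically put this check inside a `StoppingCriteria` subclass and pass it to `model.generate` via a `StoppingCriteriaList`; the snippet below shows only the core string check so it runs without the library installed.

```python
# Hypothetical helpers sketching OpenAI-style `stop` behavior.
# In a transformers-based setup, should_stop() would be called from a
# StoppingCriteria subclass on the decoded text generated so far.

def should_stop(generated_text: str, stop_sequences: list[str]) -> bool:
    """Return True once any stop sequence appears in the output."""
    return any(s in generated_text for s in stop_sequences)

def truncate_at_stop(generated_text: str, stop_sequences: list[str]) -> str:
    """Cut the output at the first stop sequence, mimicking OpenAI's `stop`."""
    cut = len(generated_text)
    for s in stop_sequences:
        idx = generated_text.find(s)
        if idx != -1:
            cut = min(cut, idx)
    return generated_text[:cut]
```

For a ReAct-style loop you would call `should_stop(text, ["Observation:"])` after each decoding step and then `truncate_at_stop` before returning the final text.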