meta-llama / llama

Inference code for Llama models

stop criterion #794

Open liugaohong opened 1 year ago

liugaohong commented 1 year ago

How can I implement a stopping criterion for Llama? Specifically, I want to pass stop sequences the way the ChatGPT API does, so that generation stops once the model has output 'observation'. For example, with the OpenAI API:

import openai

# prompt and stop are assumed to be defined elsewhere,
# e.g. stop = ["observation"]
response = openai.Completion.create(
    model="text-davinci-002",
    prompt=prompt,
    temperature=0,
    max_tokens=100,
    top_p=1,
    frequency_penalty=0.0,
    presence_penalty=0.0,
    stop=stop,
)
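The reference `generate()` in this repo does not take a `stop` argument, but the same behavior can be reproduced by checking the decoded output after each decoding step and truncating at the first stop string. Here is a minimal, framework-agnostic sketch of that logic; `step_fn` is a hypothetical stand-in for one decoding step of whatever generation loop you use, and the helper names are my own, not part of the Llama codebase:

```python
def find_stop(text, stop_strings):
    """Return the index of the earliest stop string in text, or -1 if none occur."""
    positions = [text.find(s) for s in stop_strings if s in text]
    return min(positions) if positions else -1

def generate_with_stop(step_fn, stop_strings, max_tokens=100):
    """Accumulate decoded pieces from step_fn until a stop string appears.

    step_fn() is a hypothetical stand-in for one decoding step: it returns
    the next piece of decoded text, or None at end-of-sequence. The output
    is truncated just before the first stop string, matching the OpenAI
    `stop` parameter's behavior.
    """
    out = ""
    for _ in range(max_tokens):
        piece = step_fn()
        if piece is None:  # end-of-sequence token reached
            break
        out += piece
        idx = find_stop(out, stop_strings)
        if idx != -1:
            return out[:idx]  # cut before the stop string
    return out

# Simulated decoding steps for illustration (a stop string can span pieces):
pieces = iter(["Thought: do X\n", "Observation", ": result"])
print(generate_with_stop(lambda: next(pieces, None), ["Observation"]))
```

Note that the check runs against the accumulated decoded text, not single tokens, because a stop string may be split across several tokens (as "Observation" is split from ":" above).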
jspisak commented 1 year ago

@amitsangani - do you have an example from your notebook or somewhere else to share here?