bifrostlab / llm-assist-prompts

Prompts library for LLM Assist bot

Provide the OpenAI prompt parameters to emulate the experience of ChatGPT app #1

Open nqngo opened 5 months ago

nqngo commented 5 months ago

ChatGPT is very verbose and provides a fair bit of contextual information around the prompt it is given.

When /ask is implemented, what parameters, token limits, and penalty tuning do we need to provide to OpenAI to emulate that experience?

Please investigate and provide the parameters needed.

dacphuc1993 commented 4 months ago

Here is my suggestion for the default parameters of the OpenAI API:

frequency_penalty = 0    # penalty for repeating tokens; higher values lower the likelihood of repeated words
presence_penalty = 0     # similar penalty based on whether a token has already appeared; keep 0 for balance
logprobs = False         # whether to return log probabilities in the output object
max_tokens = 1000        # maximum number of tokens the model may generate
n = 1                    # number of responses generated per request
seed = 1000              # for reproducible output; must stay consistent across the codebase
stream = False           # whether to stream the output and display text as it arrives
temperature = 0.1        # controls the creativity of the model; lower values give more factual, deterministic text
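A minimal sketch of how these defaults could be wired into the /ask handler, assuming the official `openai` Python client (v1.x). The model name, prompt, and the `build_request` helper are illustrative placeholders, not part of the bot's actual code:

```python
# Suggested defaults from the comment above, collected in one place.
DEFAULT_PARAMS = {
    "frequency_penalty": 0,   # no extra penalty for repeated tokens
    "presence_penalty": 0,    # keep default 0 for balance
    "logprobs": False,        # don't return log probabilities
    "max_tokens": 1000,       # cap on generated tokens
    "n": 1,                   # single completion per request
    "seed": 1000,             # best-effort reproducibility across calls
    "stream": False,          # return the full response at once
    "temperature": 0.1,       # low temperature -> more factual output
}

def build_request(prompt: str, model: str = "gpt-3.5-turbo") -> dict:
    """Assemble the kwargs for client.chat.completions.create(**kwargs)."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        **DEFAULT_PARAMS,
    }
```

The handler would then call something like `client.chat.completions.create(**build_request(user_text))`; keeping the defaults in one dict means a later tuning pass only touches one place.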