promptslab / Promptify

Prompt Engineering | Prompt Versioning | Use GPT or other prompt-based models to get structured output. Join our Discord for prompt engineering, LLMs and other latest research
https://discord.gg/m88xfYMbK6
Apache License 2.0

Set default parameters for each template #20

Open · escesare opened this issue 1 year ago

escesare commented 1 year ago

Currently, these are the default model parameters if the user doesn't specify them:

        model_name: str = "text-davinci-003",
        temperature: float = 0.7,
        max_tokens: int = 4000,
        top_p: float = 0.1,
        frequency_penalty: float = 0,
        presence_penalty: float = 0,
        stop: Union[str, None] = None,

But in practice, the appropriate parameters can differ quite a bit from task to task. For example, temperature = 0 is usually what we want for highly technical tasks like classification or QA, while the frequency and presence penalties should stay at 0 for most use cases and only be nonzero for tasks like summarization.
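
To make the idea concrete, here is a rough sketch of what per-template defaults could look like. The template names and values below are illustrative guesses, not Promptify's actual templates or recommended settings:

    # Hypothetical per-template defaults -- template names and values are illustrative only.
    TEMPLATE_DEFAULTS = {
        "classification.jinja": {"temperature": 0.0},
        "qa.jinja": {"temperature": 0.0},
        "summarization.jinja": {"temperature": 0.7, "frequency_penalty": 0.5, "presence_penalty": 0.5},
    }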

It would greatly improve out-of-the-box accuracy if each individual template specified its own default parameters.
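
A minimal sketch of how the lookup could be wired in, assuming a hypothetical resolve_params helper (not part of the existing Promptify API) that layers per-template defaults over the global ones and lets explicit user arguments win:

    # Hypothetical helper -- not part of the current Promptify API.
    GLOBAL_DEFAULTS = {
        "model_name": "text-davinci-003",
        "temperature": 0.7,
        "max_tokens": 4000,
        "top_p": 0.1,
        "frequency_penalty": 0,
        "presence_penalty": 0,
        "stop": None,
    }

    def resolve_params(template_name, **user_overrides):
        # Precedence: user overrides > per-template defaults > global defaults.
        per_template = TEMPLATE_DEFAULTS.get(template_name, {})  # mapping sketched above
        return {**GLOBAL_DEFAULTS, **per_template, **user_overrides}

    # Example: summarization gets nonzero penalties by default,
    # but the caller can still override anything explicitly.
    params = resolve_params("summarization.jinja", max_tokens=512)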