briansunter / logseq-plugin-gpt3-openai

A plugin for GPT-3 AI assisted note taking in Logseq
https://twitter.com/bsunter
MIT License

OpenAI settings in prompt templates #74

Open briansunter opened 1 year ago

briansunter commented 1 year ago

Support customizing settings such as model, temperature, maximum token count, frequency penalty, and stop sequences in the prompt templates, for example:

[summarize-text]
name = "Summarize Text"
description = "Summarize a text document"
temperature = 0.7
model = "text-davinci-002'
maximumLength = 500
prompt = '''
Summarize text:
'''
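A minimal sketch of how such per-template overrides could be merged with the plugin's defaults and sent to the OpenAI completions endpoint. The `TemplateOptions` shape, the option names, and `runTemplate` are hypothetical illustrations, not the plugin's actual internals; only the request body matches the public `/v1/completions` API.

```typescript
// Hypothetical per-template overrides parsed from a block like [summarize-text].
// None of these names come from the plugin's codebase; they illustrate the idea.
interface TemplateOptions {
  model?: string;
  temperature?: number;
  maximumLength?: number;    // maps to the API's max_tokens
  frequencyPenalty?: number; // maps to frequency_penalty
  stop?: string[];           // stop sequences
  prompt: string;
}

// Plugin-wide defaults; a template override wins when both are set.
const defaults = {
  model: "text-davinci-002",
  temperature: 0.7,
  maximumLength: 500,
};

async function runTemplate(opts: TemplateOptions, input: string, apiKey: string) {
  const merged = { ...defaults, ...opts };
  const res = await fetch("https://api.openai.com/v1/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
      model: merged.model,
      prompt: `${merged.prompt}\n${input}`,
      temperature: merged.temperature,
      max_tokens: merged.maximumLength,
      frequency_penalty: merged.frequencyPenalty,
      stop: merged.stop, // omitted from JSON when undefined
    }),
  });
  const data = await res.json();
  return data.choices?.[0]?.text ?? "";
}
```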
chilang commented 11 months ago

Customizing the system prompt per template would be really useful. Perhaps it could be incorporated into the existing prompt template, e.g.: prompt-template:: Custom Prompt

System: Custom system prompt text
Custom prompt text 
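One way this could work (a sketch under an assumed convention, not how the plugin currently parses templates): split an optional leading `System:` line out of the template body and send it as the system message of a chat completions request.

```typescript
// Sketch: pull an optional leading "System:" line out of a template body and
// send it as the system message. The parsing convention is an assumption based
// on the suggestion above, not an existing plugin feature.
async function runChatTemplate(templateBody: string, apiKey: string) {
  const lines = templateBody.split("\n");
  let system = "You are a helpful assistant."; // fallback system prompt
  if (lines[0]?.startsWith("System:")) {
    system = lines.shift()!.slice("System:".length).trim();
  }
  const user = lines.join("\n").trim();

  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
      model: "gpt-3.5-turbo",
      messages: [
        { role: "system", content: system },
        { role: "user", content: user },
      ],
    }),
  });
  const data = await res.json();
  return data.choices?.[0]?.message?.content ?? "";
}
```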
aswny commented 6 months ago

This feature would be really helpful.

Further, I would love to see support for a stop_sequences parameter as well. Currently, the model generates text until it reaches the token limit set in the plugin settings.
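For reference, the completions API already accepts a `stop` parameter (up to four sequences), so per-template support would mostly be a matter of plumbing it through. A rough sketch of the request body, with example sequences chosen only for illustration:

```typescript
// Sketch only: a completions request body with stop sequences. Generation halts
// at the first matching sequence instead of running to the max_tokens limit.
const body = {
  model: "text-davinci-002",
  prompt: "Summarize text:\n...",
  max_tokens: 500,
  stop: ["\n\n", "###"], // the API accepts up to four stop sequences
};
```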