sungh66 opened 1 year ago
@sungh66
What's the difference between "tprompt" and "prompt" in config.yaml?
From the config.yaml, the "tprompt", in combination with the few-shot examples in "demos_or_presteps", is used as part of the system message/prompt to "guide an AI system's behavior and improve system performance" [1]. The "prompt" is used to wrap the user's chat as part of the user message/prompt and adds instructions/requests specific to the current stage.
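To make the split concrete, here is a minimal sketch of how those config fields could be combined into chat messages. The field names (`tprompt`, `prompt`, `demos_or_presteps`) come from the discussion above, but the example values and the `build_messages` helper are hypothetical, not the repo's actual code:

```python
# Hypothetical sketch: tprompt + demos form the system side of the
# conversation; prompt wraps the user's actual input for this stage.
stage_config = {
    # system-level instruction for the stage (hypothetical wording)
    "tprompt": "#1 Task Planning Stage: parse user input into tasks.",
    # few-shot demonstrations prepended before the real user turn
    "demos_or_presteps": [
        {"role": "user", "content": "show me a picture of a cat"},
        {"role": "assistant", "content": '[{"task": "text-to-image"}]'},
    ],
    # template that wraps the live user request (placeholders are illustrative)
    "prompt": "The chat log is {{context}}. Now parse this request: {{input}}",
}

def build_messages(config, context, user_input):
    # system message from tprompt, then the few-shot demos
    messages = [{"role": "system", "content": config["tprompt"]}]
    messages += config["demos_or_presteps"]
    # prompt wraps the user's chat with stage-specific instructions
    user_content = (config["prompt"]
                    .replace("{{context}}", context)
                    .replace("{{input}}", user_input))
    messages.append({"role": "user", "content": user_content})
    return messages

msgs = build_messages(stage_config, "(empty)", "draw a dog")
```

Under this reading, changing "tprompt" changes how the model is steered globally for the stage, while changing "prompt" changes how each individual user request is framed.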
References: [1] https://learn.microsoft.com/en-us/azure/ai-services/openai/concepts/system-message
I want to add some custom operations, such as reading and writing local files as modules. Is this possible? I ask because I think AutoGPT is too flexible. Also, how does the code match the steps planned by the LLM against the model descriptions on Hugging Face?