Closed DavidCincotta closed 1 year ago
Hi, @DavidCincotta! I'm here to help the LangChain team manage their backlog, and I wanted to let you know that we are marking this issue as stale.
From what I understand, you opened this issue requesting an extension to the prompt module that would allow for generating new prompts. You were seeking input from others who may have implemented or are currently working on a similar process. However, there hasn't been any activity on the issue yet.
Before we close this issue, we wanted to check with you if it is still relevant to the latest version of the LangChain repository. If it is, please let the LangChain team know by commenting on the issue. Otherwise, feel free to close the issue yourself, or it will be automatically closed in 7 days.
Thank you for your understanding and contribution to the LangChain project!
I think the prompt module should be extended to support generating new prompts. This would create a better sandbox for evaluating different prompt templates without writing 20+ variations by hand. The core idea is to call an LLM to alter a base prompt template according to an instruction set while preserving the input variables. Maybe this should be its own chain rather than a class in the prompt module. Combined with evaluation steps, this scheme could assist in prompt tuning, e.g. driving a heuristic search that optimizes prompts against specific metrics: total prompt token count, accuracy, etc. I'm wondering if anyone has seen this type of process implemented before or is currently working on it. Starting to POC this type of class today.
edit: wording
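
A minimal sketch of the idea, assuming a plain callable LLM rather than any specific LangChain API; the names `generate_prompt_variant` and `stub_llm` are hypothetical, and a real implementation would wire this into a chain and an actual model call:

```python
import re

def generate_prompt_variant(llm, base_template, instruction):
    """Ask an LLM to rewrite a prompt template per an instruction,
    then verify every {input_variable} survived the rewrite."""
    required = set(re.findall(r"\{(\w+)\}", base_template))
    meta_prompt = (
        "Rewrite the following prompt template.\n"
        f"Instruction: {instruction}\n"
        f"Keep every placeholder exactly as written: {sorted(required)}\n"
        f"Template:\n{base_template}"
    )
    candidate = llm(meta_prompt)
    missing = required - set(re.findall(r"\{(\w+)\}", candidate))
    if missing:
        # reject variants that drop input variables
        raise ValueError(f"variant dropped variables: {missing}")
    return candidate

# stub LLM standing in for a real model call (hypothetical output)
def stub_llm(prompt):
    return "Answer concisely as an expert: {question}"

variant = generate_prompt_variant(
    stub_llm,
    "Answer the question: {question}",
    "Make the tone more authoritative",
)
```

Each accepted variant could then be scored by an evaluation step (token count, accuracy on a test set) to drive the heuristic search described above.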