stanfordnlp / dspy

DSPy: The framework for programming—not prompting—language models
https://dspy.ai
MIT License

How to change order of input and prompt/instruction for prompt caching #1835

Open anhnami opened 1 day ago

anhnami commented 1 day ago

Hi,

I would like to use DSPy for extensive Q&A and information extraction over very long input texts. Since DSPy builds the prompt from the Signature's instruction and then appends the input fields, the resulting prompt changes for every question. Can DSPy be instructed to place the instruction after the input? That way, we could reuse the KV cache on a self-hosted LLM, or save money with OpenAI's prompt caching.
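
For concreteness, the pattern I'm trying to avoid looks roughly like this (a sketch; `LONG_TEXT` and `questions` are placeholders):

```python
import dspy

# Baking each question into the Signature's instructions means the
# instruction (which DSPy renders before the inputs) differs on every
# call, so the long context that follows can never hit a cached prefix.
for question in questions:
    predict = dspy.Predict(dspy.Signature("context -> answer", question))
    predict(context=LONG_TEXT)
```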

Thanks.

okhat commented 1 day ago

Hey @anhnami! Just create an input field after the context and pass whatever instruction you need through it:

```python
dspy.Predict('context, task -> response')(context=LONG, task="Please do ....")
```
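
With the instruction fixed in the Signature and the long `context` declared before `task`, the rendered prompt keeps a byte-identical prefix across questions. A fuller sketch, assuming the standard `dspy.LM` / `dspy.configure` setup (the model name and file path are illustrative):

```python
import dspy

dspy.configure(lm=dspy.LM("openai/gpt-4o-mini"))  # any OpenAI-compatible LM

qa = dspy.Predict("context, task -> response")

long_doc = open("very_long_document.txt").read()  # the shared long input

# Input fields render in declaration order, so the fixed instruction plus
# `context` form a stable prefix; only the short `task` suffix changes,
# letting OpenAI prompt caching or a self-hosted KV cache reuse the prefix.
for task in ["Summarize the key findings.", "List every person mentioned."]:
    print(qa(context=long_doc, task=task).response)
```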