-
DSPy is very powerful and has helped me a lot in my work. Currently, the DSPy library only has chain of thought. Perhaps it would be possible to implement tree of thought (https://github.com/kyegomez/…
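For context, the core search loop behind tree of thought can be sketched in a few lines, independent of DSPy. This is a hypothetical illustration only: `propose` and `score` are placeholder callables (in a DSPy program they might each wrap a predictor module), not existing DSPy APIs.

```python
# Minimal sketch of a tree-of-thought search loop (hypothetical, not
# part of DSPy): at each depth, expand every surviving partial
# "thought" into candidates, score them, and keep the best beam.

def tree_of_thought(question, propose, score, depth=3, breadth=2):
    """propose(question, thought) -> list of extended thoughts;
    score(question, thought) -> float (higher is better)."""
    frontier = [""]  # start from an empty chain of thought
    for _ in range(depth):
        candidates = [t for thought in frontier
                      for t in propose(question, thought)]
        # Rank all expansions and prune to the beam width.
        candidates.sort(key=lambda t: score(question, t), reverse=True)
        frontier = candidates[:breadth]
    return frontier[0] if frontier else ""
```

The difference from chain of thought is only that several partial reasoning paths survive each step instead of one.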
-
Issue to track implementation/research into multi-hop retrieval
Meaning the RAG system will perform query decomposition to break down the original query into pieces, obtain answers relevant to those, an…
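The flow described above can be sketched as a small driver function. This is a hedged sketch of the control flow only; `decompose`, `retrieve`, `answer`, and `synthesize` are placeholder callables, not DSPy or RAG-framework APIs.

```python
# Hypothetical sketch of the multi-hop flow: decompose the query,
# answer each sub-query against a retriever, then synthesize a
# final answer from the partial answers.

def multi_hop(query, decompose, retrieve, answer, synthesize):
    sub_queries = decompose(query)          # query decomposition
    sub_answers = []
    for sq in sub_queries:
        passages = retrieve(sq)             # per-hop retrieval
        sub_answers.append(answer(sq, passages))
    return synthesize(query, sub_answers)   # final aggregation
```

More elaborate variants feed earlier sub-answers back into later retrieval hops, but the decompose/retrieve/synthesize skeleton stays the same.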
-
## 🚀 Feature Request
**💡 Got a brilliant idea?**
Using either OpenInference or OpenLLMetry, add an instrumentor for DSPy. This is the first of several test features we'll use to determine which is…
-
If you configure the LM (`dspy.settings.configure(lm=lm)`) before initializing your modules, you get a different prompt than if you configure your LM after initializing your modules.
```python
ds…
-
Right now, the current state of LLMs involves a lot of "prompt engineering" that the user needs to do, not only to elicit the correct response but sometimes also the correct style of response (i.e…
-
In my current setup, I write everything in DSPy, then I extract the prompt from the DSPy module. Then, I use that prompt with litellm to stream the output to the user (if the module is chain of thought…
-
Hi!
I created a simple module and a set of 10 questions and answers to evaluate a single PDF loaded into ChromaDB. When evaluating using DSPy version 2.5.16 like
```python
evaluate = dspy.Evaluat…
-
- [ ] [dspy/README.md at main · stanfordnlp/dspy](https://github.com/stanfordnlp/dspy/blob/main/README.md?plain=1)
# dspy/README.md at main · stanfordnlp/dspy
## DSPy: _Programming_—not promp…
-
There remain some questions about the right prompt for the behaviour of the different models; Llama-series models seem to handle prompts differently than GPT models. As an initial experiment, DSPy will be us…
-
I have successfully created an index and I can easily search using
```python
RAG = RAGPretrainedModel.from_index("my_index")
results = RAG.search(query, k=2, index_name="/m_index")
```
How do I integrate it in…
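One common pattern, sketched here rather than taken from any official integration, is to wrap the searcher in a small adapter that returns a plain list of passage strings, which a DSPy program can then consume as its retrieval step. The `"content"` key on each search hit is an assumption about RAGatouille's result format and should be checked.

```python
# Hypothetical adapter: wrap a RAGatouille searcher so it can be
# called like a retriever inside a DSPy program. The "content" key
# on each hit is an assumption about RAGatouille's search output.

class RAGatouilleRetriever:
    def __init__(self, searcher, k=2):
        self.searcher = searcher  # e.g. RAGPretrainedModel.from_index(...)
        self.k = k

    def __call__(self, query):
        hits = self.searcher.search(query, k=self.k)
        return [hit["content"] for hit in hits]
```

With this in place, `passages = RAGatouilleRetriever(RAG)(question)` yields strings you can feed into a downstream module's context field.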