-
I downloaded the notebook to my computer and tried to run it. I did not change anything about the retrieval model, but this line gives me a weird error...
`topK_passages = retrieve(dev_example.que…
-
Hi. I do apologize if this is in the docs somewhere, but so far all I can see is the experimental feature in the works for supporting a local Hugging Face non-TGI setup. What I am looking for is a gen…
-
I'm trying to use `langchain` to replace my current direct use of Qdrant, in order to benefit from the other tools in `langchain`; however, I'm stuck.
I already have this code that creates a Qdrant collection…
-
DSPy is using `openai
-
**Describe the bug**
Letters with diacritic marks are rendered incorrectly in some traces.
**To Reproduce**
1. Invoke instrumented LangChain with some diacritic symbols: "naïve façade café"
2. Check Phoenix…
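For what it's worth, a common cause of this symptom is UTF-8 bytes being decoded as Latin-1 somewhere in the pipeline. This self-contained sketch reproduces the same class of corruption; it illustrates the failure mode, not a claim about where the decoding happens in Phoenix:

```python
# Classic mojibake: UTF-8 bytes decoded as Latin-1 garble diacritics.
text = "naïve façade café"
garbled = text.encode("utf-8").decode("latin-1")
print(garbled)  # naÃ¯ve faÃ§ade cafÃ©

# The damage is reversible as long as no bytes were dropped:
assert garbled.encode("latin-1").decode("utf-8") == text
```

If the garbled traces show the `Ã¯`/`Ã§`/`Ã©` pattern, a mismatched decode step like this is the likely culprit.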
-
It seems a key feature is missing: the "system prompt".
The "system prompt" impacts GPT models significantly.
Could you at least let us set it in the config or somewhere else?
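As a sketch of what a configurable system prompt could look like (the `system_prompt` config key and the helper below are hypothetical, not an existing API):

```python
# Sketch: threading a configurable system prompt into an OpenAI-style
# chat-messages list. The config key name is an assumption.
config = {"system_prompt": "You are a concise assistant."}

def build_messages(user_text, cfg):
    messages = []
    if cfg.get("system_prompt"):
        messages.append({"role": "system", "content": cfg["system_prompt"]})
    messages.append({"role": "user", "content": user_text})
    return messages
```

The resulting list can then be passed wherever the chat messages are assembled for the model call.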
-
I have tried using Llama V2 to generate synthetic data for Self-Instruct. Unfortunately, my prompts are long, and the prompt/response combination from the Llama 13B chat model constantly exceeds the 4…
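One common workaround for this class of problem is to trim the prompt to fit a token budget before generation. A minimal sketch, using whitespace splitting as a stand-in for a real tokenizer (the budget numbers are illustrative, not the model's actual limits):

```python
# Sketch: keep prompt + expected response inside a context budget.
# Whitespace "tokens" stand in for the model's real tokenizer.
def truncate_prompt(prompt: str, budget: int, reserve_for_response: int) -> str:
    tokens = prompt.split()
    keep = max(0, budget - reserve_for_response)  # leave room for the response
    return " ".join(tokens[:keep])
```

In practice the same idea applies with the model's own tokenizer counting tokens instead of words.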
-
I don't care too much, but thought I would point out that DSP has a well-established meaning in a field not too far removed from NLP.
-
It seems that retrieval from Qdrant is limited to the following models:
```python
SUPPORTED_EMBEDDING_MODELS: Dict[str, Tuple[int, models.Distance]] = {
    "BAAI/bge-base-en": (768, models.Dista…
```
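Until broader support lands, one possible workaround, assuming the retriever consults this module-level dict, is to register extra models before constructing it. A self-contained sketch with a stand-in `Distance` enum (the real one lives in `qdrant_client.models`) and an illustrative extra entry:

```python
from enum import Enum
from typing import Dict, Tuple

class Distance(Enum):  # stand-in for qdrant_client.models.Distance
    COSINE = "Cosine"
    DOT = "Dot"

SUPPORTED_EMBEDDING_MODELS: Dict[str, Tuple[int, Distance]] = {
    "BAAI/bge-base-en": (768, Distance.COSINE),
}

# Register an additional model; the name and dimension are illustrative,
# and must match the embedding model actually used to build the collection.
SUPPORTED_EMBEDDING_MODELS["intfloat/e5-large-v2"] = (1024, Distance.COSINE)
```

Whether patching the dict is enough depends on how the retriever validates models, so treat this as a sketch rather than a supported extension point.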
-
I am thinking about writing a program for a "batch processing" task, i.e., on each call to the program there will be a list of inputs, and the output is a corresponding list for those inputs. To optimize …
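The batch-processing shape described here can be sketched as a plain list-in/list-out function, where `process_one` is a placeholder for the per-item program logic (e.g. an LM call):

```python
from typing import Callable, List

def batch_program(inputs: List[str], process_one: Callable[[str], str]) -> List[str]:
    # One call maps a list of inputs to the corresponding list of outputs,
    # preserving order and length.
    return [process_one(x) for x in inputs]

outputs = batch_program(["a", "b"], process_one=str.upper)
# outputs == ["A", "B"]
```

Keeping the per-item logic separate from the batching wrapper makes it easier to later swap in a vectorized or parallel implementation.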