-
Very good paper.
I would appreciate a more detailed explanation of the "basic training form of dense retrieval" mentioned in the Distant supervision section on page 5. Does it train the query and answer of M…
-
### Issue
In https://aider.chat/docs/repomap.html, it says:
> Of course, for large repositories even just the repo map might be too large for the LLM’s context window. Aider solves this problem by…
-
### Describe the bug
When an activity/event has a longer text name, its icon gets shrunk in the variables tab
### Steps to reproduce
1. Create an activity/event with a longer name that h…
-
When opening big files, it would be useful to collapse large regions of identical bytes so that more differences can be shown on one screen.
Similar to what `diff` does when it collapses identical l…
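The idea above can be sketched roughly as follows. This is an illustrative prototype, not the tool's actual implementation: it scans two equal-length byte buffers, clusters nearby differences into windows with a few context bytes on each side, and emits `skip` events for long identical runs (the function name and event tuples are assumptions).

```python
# Hypothetical sketch: collapse long identical runs between two byte
# buffers, keeping `context` bytes around each cluster of differences.
# Only the common-length prefix is compared, for brevity.

def collapse_identical(a: bytes, b: bytes, context: int = 4, min_gap: int = 16):
    """Return ('skip', offset, length) and ('diff', offset, a_chunk, b_chunk) events."""
    n = min(len(a), len(b))
    diffs = [i for i in range(n) if a[i] != b[i]]  # offsets that differ
    events = []
    pos = 0
    i = 0
    while i < len(diffs):
        start = max(diffs[i] - context, pos)
        # merge diffs that are close enough to share one window
        j = i
        while j + 1 < len(diffs) and diffs[j + 1] - diffs[j] <= 2 * context:
            j += 1
        end = min(diffs[j] + context + 1, n)
        if start - pos >= min_gap:
            events.append(('skip', pos, start - pos))  # collapse identical run
        else:
            start = pos  # run too short to be worth collapsing
        events.append(('diff', start, a[start:end], b[start:end]))
        pos = end
        i = j + 1
    if n - pos > 0:
        events.append(('skip', pos, n - pos))  # trailing identical run
    return events
```

A viewer could then render each `skip` as a single "N identical bytes" line, much like `diff`'s collapsed hunks.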
-
**Description**
**Expected behaviour**
**What is happening instead?**
**Additional context**
**How to reproduce?**
**Files**
-
**Describe the bug**
This started recently and I'm uncertain as to the cause. I have a rather large S3 directory (10TB) that I delete using the following code:
```
let object_store_s3_path = &o…
-
## Summary
We currently build the prompt naively, concatenating the full text of every doc returned by the vector db's similarity search.
If this results in a prompt larger than the context window of …
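One possible fix is to pack docs into a token budget instead of concatenating everything. The sketch below is illustrative, not this project's code: it adds docs in rank order and stops at the first one that would overflow the budget, using a crude whitespace word count as a stand-in for a real tokenizer (an assumption; a production version would use the model's own tokenizer).

```python
# Hedged sketch: fit similarity-search results into a token budget.
# All names are illustrative; the token estimate is a rough proxy.

def build_prompt(question: str, docs: list[str], budget: int = 3000) -> str:
    """Add docs in rank order until the estimated token budget is spent."""
    def est_tokens(text: str) -> int:
        return len(text.split())  # crude whitespace approximation

    used = est_tokens(question)
    kept = []
    for doc in docs:  # docs assumed sorted best-first by similarity
        cost = est_tokens(doc)
        if used + cost > budget:
            break  # stop at first overflow, preserving rank order
        kept.append(doc)
        used += cost
    context = "\n\n".join(kept)
    return f"Context:\n{context}\n\nQuestion: {question}"
```

Stopping at the first overflow (rather than skipping to smaller lower-ranked docs) keeps the context in strict relevance order, which is usually the safer default.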
-
### 🚀 The feature, motivation and pitch
In the context of a text-only large language model (LLM), the input is often truncated from the left to ensure the conversation can continue within the token limit…
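Left truncation of a chat history can be sketched as below. This is an illustrative example, not any framework's actual API: it drops the oldest non-system messages until the estimated total fits, and the whitespace-based token estimate is an assumption standing in for a real tokenizer.

```python
# Illustrative sketch of left truncation for a chat history: preserve
# system messages, drop the oldest user/assistant turns first.

def truncate_left(messages: list[dict], max_tokens: int) -> list[dict]:
    def est(msg: dict) -> int:
        return len(msg["content"].split())  # rough token proxy

    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    total = sum(est(m) for m in system) + sum(est(m) for m in rest)
    while rest and total > max_tokens:
        total -= est(rest.pop(0))  # remove from the left (oldest first)
    return system + rest
```

Keeping system messages out of the truncation window matters because they typically carry instructions the model must retain for the whole conversation.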
-
Pulling the docs bucket using the CLI just now I hit `Error! Context deadline exceeded`.
-
### Describe the bug
I am using llama3.1 70b with ollama, which has an extremely large context window. When I set the MessageTokenLimiter, it assumes that I am using gpt-3.5-turbo-0613, which I am not. …