-
### Issue
I just wanted to check the /help function to ask a couple of questions. After two questions I checked /tokens, and this is what I saw:
![image](https://github.com/user-attachments/assets/0a40cb7c-7e58-443c-8f1…
-
### Question Validation
- [X] I have searched both the documentation and discord for an answer.
### Question
I have already made a client for Claude 3 with `complete` and `acomplete`, and it works …
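A minimal sketch of what such a client might look like, using the official `anthropic` SDK with a sync `complete` and an async `acomplete` (the model name and token limit here are placeholders):

```python
# Hypothetical Claude 3 client sketch (placeholder model name and limits),
# pairing a sync complete() with an async acomplete() via the anthropic SDK.
import anthropic


class Claude3Client:
    def __init__(self, model: str = "claude-3-opus-20240229"):
        self.model = model
        self._sync = anthropic.Anthropic()
        self._async = anthropic.AsyncAnthropic()

    def complete(self, prompt: str, max_tokens: int = 1024) -> str:
        resp = self._sync.messages.create(
            model=self.model,
            max_tokens=max_tokens,
            messages=[{"role": "user", "content": prompt}],
        )
        return resp.content[0].text

    async def acomplete(self, prompt: str, max_tokens: int = 1024) -> str:
        resp = await self._async.messages.create(
            model=self.model,
            max_tokens=max_tokens,
            messages=[{"role": "user", "content": prompt}],
        )
        return resp.content[0].text
```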
-
I have a basic DSPy example where I am hoping to compare the performance of different LLMs.
In the example I hope to try running llama3 8b and 70b; however, 70b throws an error re: `max_tokens`, whereas th…
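A minimal sketch of running both models with an explicit `max_tokens`, assuming the `dspy.LM` (LiteLLM-style) interface; the Ollama model identifiers and token limit are placeholders:

```python
# Sketch: compare two Llama 3 sizes under the same max_tokens cap.
# Model identifiers assume Ollama-hosted models and are placeholders.
import dspy

llama3_8b = dspy.LM("ollama_chat/llama3:8b", max_tokens=1024)
llama3_70b = dspy.LM("ollama_chat/llama3:70b", max_tokens=1024)

qa = dspy.Predict("question -> answer")

for lm in (llama3_8b, llama3_70b):
    with dspy.context(lm=lm):
        print(qa(question="What is the capital of France?").answer)
```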
-
### Issue
Since updating to 0.512, Aider is more likely not to make commits when in code mode, instead just offering suggestions as if in chat mode. When this happens it is usually resolved by …
-
### Issue
On both the test and lint commands, aider fails to auto-execute because it's running under (I believe) sh. However, I use paths etc. in my shell to point to the locations of these binaries.…
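A hypothetical workaround sketch: resolve the binaries to absolute paths first and hand those to aider's `--test-cmd` / `--lint-cmd` flags, so the `sh` subshell does not depend on the interactive shell's PATH setup (the tool names below are just examples):

```python
# Sketch: build an aider invocation with absolute tool paths (example tools).
import shutil

test_bin = shutil.which("pytest")   # e.g. /home/user/.venv/bin/pytest
lint_bin = shutil.which("ruff")     # e.g. /home/user/.local/bin/ruff

print(f'aider --test-cmd "{test_bin}" --lint-cmd "{lint_bin} check"')
```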
-
**Is your feature request related to a problem? Please describe.**
I want to use Anthropic prompt caching, but doing so requires accessing the beta version of the Anthropic client. It appears the Instr…
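For context, a sketch of the underlying call this would need to expose, based on Anthropic's prompt-caching beta docs (the beta header and `cache_control` block follow those docs; the model name and prompt text are placeholders):

```python
# Sketch of a prompt-caching request via the anthropic SDK's beta header;
# model name and prompt contents are placeholders.
import anthropic

client = anthropic.Anthropic()

resp = client.messages.create(
    model="claude-3-5-sonnet-20240620",
    max_tokens=1024,
    extra_headers={"anthropic-beta": "prompt-caching-2024-07-31"},
    system=[
        {
            "type": "text",
            "text": "<long reusable system prompt>",
            "cache_control": {"type": "ephemeral"},
        }
    ],
    messages=[{"role": "user", "content": "Question about the cached context"}],
)
print(resp.usage)
```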
-
Thanks for adding tool/`with_structured_output` support to `ChatBedrockConverse`!
It is working great for models like `mistral_large`. However, the Llama 3.1 models don't work with `with_structured_output`…
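A minimal sketch of the failing call, assuming the `langchain_aws` package (the model ID, region, and schema here are illustrative):

```python
# Sketch: structured output via ChatBedrockConverse (illustrative model/schema).
from langchain_aws import ChatBedrockConverse
from pydantic import BaseModel


class Person(BaseModel):
    name: str
    age: int


llm = ChatBedrockConverse(
    model="meta.llama3-1-70b-instruct-v1:0",
    region_name="us-east-1",
)
structured_llm = llm.with_structured_output(Person)

# Reportedly works for models like Mistral Large but errors for Llama 3.1.
print(structured_llm.invoke("Alice is 30 years old."))
```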
-
www.zerodochat.com/chat
https://shared.oaifree.com/dashboard
https://sdk.vercel.ai
websites with GPT-4o
https://api.bbff.bf/register?aff=3Laf
Another key I applied for:
API:https://imhaiku.pages.dev…
-
### Issue
I get the following error message after having added a .jpg image to the chat:
```
Traceback (most recent call last):
File "[redacted]/.local/bin/aider", line 8, in
sys.exit(ma…
-
I have the following configuration:
```
defaultTest:
  options:
    provider:
      text:
        id: bedrock:anthropic.claude-3-haiku-20240307-v1:0
        config:
          region: eu-cent…
```