-
Unable to connect to api.openai.com from some countries, so please add a config option for this.
-
### Issue
I used:
FIREWORKS_AI_API_KEY = "mykey" # in .env file
And I configured .aider.model.metadata.json as below:
```
"fireworks_ai/deepseek-coder-v2-instruct": {
"max_tokens": 655…
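```

For reference, a complete entry in this file generally follows the litellm model-metadata fields; the sketch below uses placeholder values (not the truncated original's numbers):

```json
{
  "fireworks_ai/deepseek-coder-v2-instruct": {
    "max_tokens": 4096,
    "max_input_tokens": 128000,
    "input_cost_per_token": 0.0000009,
    "output_cost_per_token": 0.0000009,
    "litellm_provider": "fireworks_ai",
    "mode": "chat"
  }
}
```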
-
### Describe the bug
When providing an assistant ID for GPTAssistantAgent, the code path at line 117 always has a `None` value for the variables `instructions` and `specified_tools`. This is because the…
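A hypothetical sketch of the fix pattern the report suggests (the names below are illustrative, not autogen's actual API): when retrieving an existing assistant by ID, only override its stored settings if the caller actually supplied them.

```python
def merge_assistant_config(existing: dict, instructions=None, specified_tools=None) -> dict:
    """Return a copy of the retrieved assistant's config, overriding fields
    only when the caller explicitly passed a non-None value."""
    config = dict(existing)
    if instructions is not None:       # keep the assistant's stored instructions otherwise
        config["instructions"] = instructions
    if specified_tools is not None:    # likewise for tools
        config["tools"] = specified_tools
    return config
```

With this guard, passing only an assistant ID leaves the stored `instructions` and `tools` untouched instead of clobbering them with `None`.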
-
# bug
I'm using the Tongyi Qianwen API in proxy mode. I've experimented with it successfully on Lobechat, but I'm encountering network errors with this plugin.
![image](https://github.com/logan…
-
**Is your feature request related to a problem? Please describe.**
The server URL for the OpenAI API is not applicable to the Azure OpenAI endpoint URL.
**Describe the solution you'd like**
OPENAI_TYPE=A…
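A minimal sketch of why one fixed server URL doesn't fit both providers: Azure OpenAI routes requests per resource and deployment and requires an `api-version` query parameter, unlike the fixed `api.openai.com` base URL. The resource, deployment, and version values below are placeholders.

```python
def chat_completions_url(api_type: str, *, resource="myresource",
                         deployment="gpt-4o", api_version="2024-02-01") -> str:
    """Build the chat-completions URL for either API type (placeholder values)."""
    if api_type == "azure":
        # Azure OpenAI embeds the resource name and deployment in the path
        # and requires an explicit api-version query parameter.
        return (f"https://{resource}.openai.azure.com/openai/deployments/"
                f"{deployment}/chat/completions?api-version={api_version}")
    # Standard OpenAI uses a single fixed base URL.
    return "https://api.openai.com/v1/chat/completions"
```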
Mshz2 updated
2 months ago
-
Support the OpenAI API format by giving an option to switch between Ollama's proprietary API format and the OpenAI API format.
To fetch the list of models - https://platform.openai.com/docs/api-reference/models/lis…
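In the OpenAI API format, the model list comes from a `GET /v1/models` call with a bearer token. A sketch of building that request with the standard library (the localhost base URL is an assumption about a typical local Ollama setup; the request is only constructed here, not sent):

```python
import urllib.request

def build_models_request(base_url: str, api_key: str) -> urllib.request.Request:
    """Build a GET /v1/models request in the OpenAI-compatible format."""
    return urllib.request.Request(
        f"{base_url.rstrip('/')}/v1/models",               # list-models endpoint
        headers={"Authorization": f"Bearer {api_key}"},    # bearer-token auth
        method="GET",
    )
```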
-
![image](https://github.com/excalidraw/excalidraw/assets/4499997/6db813f4-ee34-4ab9-b1f5-5ee782207ce0)
The current AI feature can only use an official OpenAI API key; can we also support using other…
-
my code:
```python
# -*- coding: utf-8 -*-
import os
import ollama
from …
```
nj159 updated
2 months ago
-
### Feature Description
Hi everyone,
We're currently working on an [open-source](https://github.com/merlinn-co/merlinn) project that uses llama-index in order to ingest + embed some data into Ch…
-
### The Feature
- Users want to pass their OpenAI or Anthropic API key in the request header
- LiteLLM should use the API key passed in the header to make the LLM API call
- This should be opt-in, `l…
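The opt-in behavior requested above could be sketched as follows (the names are illustrative, not LiteLLM's actual configuration): prefer a key supplied in the request's `Authorization` header when the option is enabled, otherwise fall back to the server-configured key.

```python
def resolve_api_key(headers: dict, server_key: str, allow_client_keys: bool) -> str:
    """Pick the API key for the outbound LLM call (hypothetical helper)."""
    auth = headers.get("Authorization", "")
    if allow_client_keys and auth.startswith("Bearer "):
        return auth[len("Bearer "):]   # opt-in: use the caller's key
    return server_key                  # default: server-configured key
```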