-
## Release checklist:
- [x] Create a new [Release Issue](https://github.com/enricoros/big-AGI/issues/new?assignees=enricoros&projects=enricoros/4&template=maintainers-release.md&title=Release+1.16.…
-
I tried using the OpenAI and Groq API code as a template for creating a Mistral API pathway in Jan... but nothing worked. Whoever set these up could probably add Mistral in a few seconds.
As the misr…
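For context, Mistral's hosted API follows the OpenAI chat-completions format, so a pathway could likely reuse the existing OpenAI-style client code with a different base URL. A minimal stdlib-only sketch (the endpoint and model name are taken from Mistral's public docs; nothing here is tested against Jan itself):

```python
import json
import urllib.request

def build_mistral_request(api_key: str, prompt: str,
                          model: str = "mistral-small-latest") -> urllib.request.Request:
    """Build an OpenAI-style chat-completions request for Mistral's API.

    The base URL and default model name are assumptions from Mistral's
    public documentation, not from Jan's codebase.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        "https://api.mistral.ai/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
```

Sending this with `urllib.request.urlopen` (or any HTTP client) should return the familiar `choices[0].message.content` response shape, which is why the existing OpenAI/Groq pathways make a good template.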
-
Do you support Databricks foundation model APIs?
https://docs.databricks.com/en/machine-learning/foundation-models/index.html
Thanks,
Ravi.
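For what it's worth, the Foundation Model APIs expose an OpenAI-compatible chat-completions route under each workspace's `serving-endpoints` path, so an integration could likely reuse an existing OpenAI-style client. A sketch, with the workspace host, token, and model name as placeholders rather than anything verified here:

```python
def databricks_openai_kwargs(workspace_host: str, token: str) -> dict:
    """Keyword arguments for pointing an OpenAI-style client at a Databricks
    workspace's Foundation Model APIs.

    Assumption: Databricks serves an OpenAI-compatible route under
    /serving-endpoints, authenticated with a personal access token.
    """
    return {
        "base_url": f"{workspace_host.rstrip('/')}/serving-endpoints",
        "api_key": token,  # a Databricks personal access token
    }

# Usage (requires the `openai` package and real credentials; model names
# such as "databricks-dbrx-instruct" are examples from Databricks' docs):
# from openai import OpenAI
# client = OpenAI(**databricks_openai_kwargs("https://<workspace>.cloud.databricks.com", "<token>"))
# client.chat.completions.create(model="databricks-dbrx-instruct",
#                                messages=[{"role": "user", "content": "Hi"}])
```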
-
Please add support for LiteLLM, because it will help us use 100+ LLMs easily. We could call all the different LLM APIs using the OpenAI format, so it would reduce our burden a lot because we wouldn't ha…
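As a reference point, LiteLLM's core entry point is `litellm.completion`, which takes OpenAI-format messages for every backend it supports; only the model string changes per provider. A sketch of the call shape (the model strings follow LiteLLM's provider-prefix convention and are examples, not a tested list):

```python
# OpenAI-format request that LiteLLM accepts for every backend it supports.
messages = [{"role": "user", "content": "Say hi in one word."}]

# Only the model string changes per provider; the prefixes are LiteLLM's
# routing convention, and provider API keys must be set in the environment.
models = ["gpt-4o-mini", "mistral/mistral-small-latest", "groq/llama3-70b-8192"]

# Usage (requires `pip install litellm` plus provider keys):
# import litellm
# for model in models:
#     resp = litellm.completion(model=model, messages=messages)
#     print(model, "->", resp.choices[0].message.content)
```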
-
Saw this LM-Studio Cookbook:
```python
from phi.assistant import Assistant
from phi.llm.openai.like import OpenAILike
from phi.tools.duckduckgo import DuckDuckGo
assistant = Assistant(
ll…
-
This happens a lot with LLAMA3 70B:
```python
from fix_busted_json import repair_json

invalid_json = "{ name: 'John' "
fixed_json = repair_json(invalid_json)
```
Throws an exception:
Except…
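Library behavior aside, this particular failure mode (bare keys, single quotes, a missing closing brace) can be patched with a naive stdlib-only repair. A sketch for illustration, deliberately not a general-purpose fixer (it ignores escapes and colons inside strings):

```python
import json
import re

def naive_repair(s: str) -> str:
    """Best-effort repair for small LLM-emitted JSON fragments.

    Handles three common defects only: unquoted object keys,
    single-quoted strings, and unbalanced closing braces.
    """
    # Quote bare object keys:  { name:  ->  { "name":
    s = re.sub(r'([{,]\s*)([A-Za-z_][A-Za-z0-9_]*)\s*:', r'\1"\2":', s)
    # Convert single-quoted strings to double-quoted (naive: no escape handling)
    s = re.sub(r"'([^']*)'", r'"\1"', s)
    # Append any missing closing braces
    s += "}" * (s.count("{") - s.count("}"))
    return s

print(json.loads(naive_repair("{ name: 'John' ")))
```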
-
### Bug Description
```python
import qdrant_client
# QdrantVectorStore and StorageContext come from llama_index (import path varies by version)

client = qdrant_client.QdrantClient(location=":memory:")
vector_store = QdrantVectorStore(client=client, collection_name="paul_graham")
storage_context = StorageContext.from_…
```
-
# Problem
Recent versions of the `openai` lib require `OPENAI_API_KEY` at `openai.OpenAI()` init time.
This cascades down to `bambooai` on the following line (see traceback below): https://github.com…
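Since openai>=1.0 raises at `OpenAI()` construction when no key is available, one common fix is to defer client creation until first use instead of building the client at import time. A sketch of that pattern (a hypothetical wrapper, not bambooai's actual code):

```python
import os

class LazyOpenAI:
    """Defer constructing the OpenAI client until first use, so merely
    importing the module does not require OPENAI_API_KEY to be set.
    (Hypothetical wrapper for illustration; the openai import happens
    inside the property, only when a call is actually made.)"""

    def __init__(self) -> None:
        self._client = None

    @property
    def client(self):
        if self._client is None:
            key = os.environ.get("OPENAI_API_KEY")
            if key is None:
                raise RuntimeError("Set OPENAI_API_KEY before the first API call")
            from openai import OpenAI  # assumes openai>=1.0
            self._client = OpenAI(api_key=key)
        return self._client

# Constructing the wrapper is now safe without a key; only .client needs one.
llm = LazyOpenAI()
```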
-
### What happened?
I tried to reproduce the code from the [docs](https://litellm.vercel.app/docs/completion/prompt_formatting):
```python
import litellm
# Create your own custom prompt templat…
-
### How are you running AnythingLLM?
Docker (local)
### What happened?
The Agent query and response history from the 'default' thread spills over into newly created threads.
Steps:
- Do a…