-
### How are you running AnythingLLM?
Docker (local)
### What happened?
Trained the system on about 100 PDF and text documents, using the default built-in database. Windows 11. Docker image pulled aro…
-
Please place an "x" in all the boxes that apply
---------------------------------------------
- [X] I have the most recent version of this package and R
- [X] I have found a bug
- [X] I have …
-
@QianHaosheng @bugtig6351 @yuanpcr you can list all potential metrics for the `generate` task in this issue. For more details about the `generate` task, you can refer to issue #12.
-
### Library name and version
Azure.AI.OpenAI 1.0.0-beta.13
### Describe the bug
We're working on a chatbot that uses a GPT model in an Azure OpenAI service with Retrieval-Augmented Generation to a…
-
### Question Validation
- [X] I have searched both the documentation and discord for an answer.
### Question
I understand that in a CitationQueryEngine, we retrieve a couple of reference nodes from…
-
Update the chatbot to add Cohere as a provider, then update the chat and stream functions to call the wrapper.
Expected changes:
- Add [cohere provider](https://github.com/intelligentnode/Intelli/b…
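The requested change follows a common pluggable-provider pattern: the chatbot keeps a registry of provider wrappers and dispatches `chat`/`stream` calls to whichever one is selected. A minimal sketch of that pattern (class and method names such as `CohereWrapper` and `add_provider` are illustrative, not the actual Intelli API, and the wrapper bodies are stubs rather than real Cohere calls):

```python
class CohereWrapper:
    """Hypothetical stand-in for a Cohere provider wrapper."""

    def chat(self, messages):
        # A real wrapper would call Cohere's chat API here.
        return f"cohere reply to: {messages[-1]['content']}"

    def stream(self, messages):
        # A real wrapper would yield streamed tokens from Cohere here.
        yield "cohere "
        yield "stream"


class Chatbot:
    """Dispatches chat/stream calls to pluggable provider wrappers."""

    def __init__(self):
        self.providers = {}

    def add_provider(self, name, wrapper):
        self.providers[name] = wrapper

    def chat(self, provider, messages):
        return self.providers[provider].chat(messages)

    def stream(self, provider, messages):
        yield from self.providers[provider].stream(messages)


bot = Chatbot()
bot.add_provider("cohere", CohereWrapper())
print(bot.chat("cohere", [{"role": "user", "content": "hi"}]))
# → cohere reply to: hi
```

With this shape, adding Cohere only means registering one more wrapper; the chat and stream entry points stay unchanged.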
-
### Bug Description
I followed the installation and setup steps described in the documentation page. Everything seems to be correctly setup but when I run the starter tutorial code the model doesn'…
-
This comes with some built-in GitHub support and looks as though it's becoming a standard:
https://citation-file-format.github.io/
https://docs.github.com/en/repositories/managing-your-repositorys…
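For reference, a minimal `CITATION.cff` file that GitHub picks up from a repository root looks roughly like this (the title, author, and version values are placeholders, not taken from any real project):

```yaml
cff-version: 1.2.0
message: "If you use this software, please cite it as below."
title: "Example Software"
authors:
  - family-names: Doe
    given-names: Jane
version: 1.0.0
date-released: 2024-01-15
```

GitHub renders a "Cite this repository" button from such a file and can export APA and BibTeX forms of the citation.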
-
If llama3 from Ollama is running on http://8.140.18.**:28275, the following code from the 60th example runs fine.
```
from txtai.pipeline import LLM
llm = LLM("ollama/llama3", method="litellm", a…
-
### Bug description
When rendering a Quarto document to GitHub Flavored Markdown (GFM), the output does not use this linked GFM citation syntax:
```
ants are great, see [[1]](#1)
## Table of…