-
### How are you running AnythingLLM?
Docker (local)
### What happened?
The Agent query and response history from the 'default' thread spills over into newly created threads.
Steps:
- Do a…
-
Recently I got a flow working where I train a model with mlx (this is new for me) and then move over to llama.cpp to convert it to GGUF so I can run it locally in LM Studio. However wit…
-
### How are you running AnythingLLM?
Docker (local)
### What happened?
I can't upload a whole folder of documents when the folder contains subfolders that themselves contain documents.
For some reason,…
-
![info](https://github.com/ivan-hc/AM/assets/6384793/1e6e22e0-cc97-4e44-b532-5f137d3b3d42)
Somewhere here there should be a mention of the AM website.
-
### Is there an existing issue / discussion for this?
- [X] I have searched the existing issues / discussions
### Is this question already answered in the FAQ? | Is there an existing ans…
-
I'm really new to this, but I got a few errors. I mention the bot, then it tries to reply but doesn't. Here is the error:
![image](https://github.com/jakobdylanc/discord-llm-chatbot/assets/89128767/7ff0d05…
-
### Is your feature request related to a problem? Please describe.
I tried to run your interpreter with --local via WSL on Windows (Debian, Python 3.12, venv, pip).
Before that, I ran the LM Studio Windows vers…
-
Hello,
as you certainly know, LM Studio is a simple tool for testing LLMs on a local PC. LM Studio also has a server that mimics the OpenAI API. Is it possible to configure this tool to use, for example, Mi…
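Since LM Studio's local server mimics the OpenAI API, any client that speaks the OpenAI chat-completions format can typically be pointed at it by overriding the base URL. A minimal sketch of the request shape (the `http://localhost:1234/v1` endpoint is an assumption based on LM Studio's default port, and the model name is a placeholder, since LM Studio serves whichever model is loaded):

```python
import json

# Assumption: LM Studio's default local endpoint (OpenAI-compatible).
BASE_URL = "http://localhost:1234/v1"

def build_chat_request(prompt, model="local-model"):
    """Build an OpenAI-style chat-completions payload for a local server.

    The model name is a placeholder; LM Studio generally ignores it and
    uses the currently loaded model.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }

payload = build_chat_request("Say hello")
print(json.dumps(payload, indent=2))

# To actually send it (requires a running LM Studio server):
#   import urllib.request
#   req = urllib.request.Request(
#       f"{BASE_URL}/chat/completions",
#       data=json.dumps(payload).encode(),
#       headers={"Content-Type": "application/json"},
#   )
#   print(urllib.request.urlopen(req).read().decode())
```

Because the wire format is the same, tools with a configurable OpenAI base URL can often be redirected to LM Studio without any other changes.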
-
**Is your feature request related to a problem? Please describe.**
No, it's a feature request.
**Describe the solution you'd like**
I would like LM Studio to be supported as an LLM backend. It is …
-
> The optional server-side functionality of fabric is called the Mill.
Is the Mill only meant to interact with OpenAI, or is it possible to set up a Mill with a local LLM via something like LMStudio/…