-
Hi, would it be possible to support koboldcpp? It is faster, loads more models than LM Studio, and has better compatibility with Linux.
In fact, it connects using the LM Studio option and writing …
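For context on why the LM Studio option works: KoboldCpp also exposes an OpenAI-compatible HTTP API, so a client configured for LM Studio can usually be pointed at it by swapping the base URL. A minimal sketch using only the standard library, assuming KoboldCpp's default port 5001 (the port and the `koboldcpp` model name here are assumptions, not values from this issue):

```python
import json
import urllib.request

# Assumed base URL: KoboldCpp's default OpenAI-compatible endpoint.
BASE_URL = "http://localhost:5001/v1"

def build_chat_request(prompt, model="koboldcpp"):
    """Build an OpenAI-style chat-completions request (no network call here)."""
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 64,
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("Hello")
print(req.full_url)  # http://localhost:5001/v1/chat/completions
```

Sending the request with `urllib.request.urlopen(req)` would then work against any server that speaks this schema, which is why the LM Studio connector happens to accept it.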
-
**Describe the bug**
I've tried to use WizardCoder for FIM and it is not working; here is the LM Studio log:
```
[2024-04-21 11:39:29.795] [INFO] [LM STUDIO SERVER] Success! HTTP server listenin…
-
### How are you running AnythingLLM?
AnythingLLM desktop app
### What happened?
I'm using the latest AnythingLLM Version 1.1.1 (1.1.1) on an M2 Mac Studio.
I tried to embed some PDF or TXT files wit…
-
### 🚀 The feature
Dear devs, great project. It would be awesome if we could add support for LM Studio and its local server API, given the rise in popularity.
Local API:
`# Example: reuse your existi…
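To illustrate what consuming that local API looks like: LM Studio's server returns responses in the OpenAI chat-completions shape, so extracting the reply is the same as with the hosted API. A sketch with a hard-coded sample body (the sample content and the default port 1234 mentioned in the comment are assumptions for illustration):

```python
import json

# Hypothetical sample of an OpenAI-style response body, as a local server
# (e.g. LM Studio at http://localhost:1234/v1) would return it; the field
# names follow the OpenAI chat-completions schema.
sample = json.loads("""
{
  "choices": [
    {"message": {"role": "assistant", "content": "Hello from the local model."}}
  ]
}
""")

def extract_reply(response):
    """Pull the assistant text out of an OpenAI-style chat response."""
    return response["choices"][0]["message"]["content"]

print(extract_reply(sample))  # Hello from the local model.
```

Because the schema matches, existing OpenAI client code can typically be reused against the local server by changing only the base URL.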
-
**Describe the bug**
An error occurred during streaming while getting the response.
**Log and Stack trace**
2024-02-24T01:17:40.732+08:00 ERROR 27155 --- [alhost:1337/...] c.m.m.l.CustomStreamingResponseH…
-
First, thanks for the energy and effort that has been put into this plugin.
I found two additional local open-source options.
I discovered https://github.com/janhq/jan when I was researching LM St…
-
### Describe the bug
I'm using OI with local models as follows:
```bash
interpreter \
-y \
--api_base http:://localhost:8000 \
--model openai/gpt-3.5 \
--context_window 4096 \
--api_k…
-
-
### How are you running AnythingLLM?
Docker (local)
### What happened?
Downloaded the AppImage and tried to run it.
Selected LM Studio and the included embedder.
Would not create a workspace just…
-
![info](https://github.com/ivan-hc/AM/assets/6384793/1e6e22e0-cc97-4e44-b532-5f137d3b3d42)
Somewhere here there should be a mention of the AM website.