-
LM Studio is a desktop tool for running LLMs locally. After having done this by hand (as the plugin does), and after trying other tools for the job, it's clear (to me) that LM Studio is going to be…
-
**Is your feature request related to a problem? Please describe.**
I need a better LLM for processing chat completions.
**Describe the solution you'd like**
Add Llama 3 to LocalAI, and support f…
-
Thank you for your efforts on this project! I'm excited to get it running properly.
The LLM text generation seems to work fine, but when I try to use the vision tab I get the error below. Once this…
CCpt5 updated 3 months ago
-
Need a way to build a prototype, because the [PRD](https://gist.github.com/dctmfoo/eac3a1296e349188c96b04f0d221d733) looks extensive and many of its features are probably not required for the first run.
> text:…
-
Please add a spellchecking option for each input box, per conversation.
-
### What would your feature do?
LM Studio support for translation with various LLMs (Llama 3, etc.)
-
I accidentally posted this bug in the CLI version of the bug tracker.
Bug: with Flash Attention, KV cache quantization is stuck at FP16 with no way to revert to Q4_0.
The gist of it is, no way to …
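For comparison, in upstream llama.cpp the KV cache types are opt-in command-line flags rather than fixed; a sketch of how they can be set there (the model path is a placeholder, and note that a quantized V cache requires flash attention):

```shell
# llama.cpp server: quantize the KV cache to Q4_0 instead of the FP16 default.
# --cache-type-k / --cache-type-v set the K and V cache types; a quantized
# V cache requires flash attention (--flash-attn). Model path is a placeholder.
llama-server -m ./model.gguf \
  --flash-attn \
  --cache-type-k q4_0 \
  --cache-type-v q4_0
```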
-
Hello, after installing ChatRTX with ChatGLM 3 6B int4, the runtime prompt reads:
ChatRTX
Problem generating response: Data source may be empty or unsupported – Ensure dataset compatibility with the AI mo…
-
It would be great if we could interact with other services that are not running locally, using LM Studio only as a GUI by default (of course, we would lose the capability to load and run locally in a …
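Since LM Studio already exposes an OpenAI-compatible chat completions endpoint, a GUI-only mode would mostly amount to letting the base URL point at a remote host instead of localhost. A minimal sketch of the client side of such a request (the host, port, and model name here are placeholder assumptions, not LM Studio defaults):

```python
import json
from urllib import request

def build_chat_request(base_url: str, model: str, user_message: str) -> request.Request:
    """Build an OpenAI-compatible /v1/chat/completions request for any host,
    local or remote. base_url and model are caller-supplied placeholders."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }
    return request.Request(
        url=f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Pointing at a remote server instead of localhost is just a different base URL:
req = build_chat_request("http://192.168.1.50:1234", "llama-3-8b-instruct", "Hello")
```

The same request shape works against any OpenAI-compatible backend, which is what makes a pure-GUI mode plausible.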
-
### Describe the bug
The environment starts correctly, and all models from LM Studio are visible in bolt and selectable from the dropdown list.
My configuration:
Windows 10 where I have LLM Stu…