-
To scope https://llmstxt.org/
-
Can I use other LLMs? I connect to other remote models via their APIs, then run a local web server that bridges any OpenAI-compatible HTTP request to the respective model.
I can see Lumos…
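A minimal sketch of that bridging idea, assuming Flask and requests are installed; `BACKEND_URL` and `BACKEND_KEY` are hypothetical names for the remote model's OpenAI-compatible endpoint and key, not anything defined by the project:

```python
# Sketch only: a local server that re-broadcasts OpenAI-style chat-completion
# requests to a remote backend. BACKEND_URL / BACKEND_KEY are hypothetical.
import os

import requests
from flask import Flask, jsonify, request

app = Flask(__name__)
BACKEND_URL = os.environ.get("BACKEND_URL", "https://example.com/v1")  # remote model's OpenAI-compatible base URL (assumption)
BACKEND_KEY = os.environ.get("BACKEND_KEY", "")


@app.route("/v1/chat/completions", methods=["POST"])
def chat_completions():
    # Forward the incoming OpenAI-style request body to the remote model.
    resp = requests.post(
        f"{BACKEND_URL}/chat/completions",
        headers={"Authorization": f"Bearer {BACKEND_KEY}"},
        json=request.get_json(force=True),
        timeout=120,
    )
    return jsonify(resp.json()), resp.status_code


if __name__ == "__main__":
    app.run(port=8000)  # local clients point their OpenAI base URL at http://localhost:8000/v1
```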
-
**Describe the solution you'd like**
The list of available LLMs should include LLMs from NVIDIA as well
**Additional context**
This will also require a new place to insert the NVIDIA API key
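If it helps, a minimal sketch of what wiring in an NVIDIA endpoint could look like; the base URL and model id are assumptions based on NVIDIA's hosted, OpenAI-compatible API and are not this project's configuration:

```python
# Sketch only: calling an NVIDIA-hosted model through the OpenAI client.
# The base_url and model id below are assumptions, not project config.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://integrate.api.nvidia.com/v1",  # NVIDIA's OpenAI-compatible endpoint (assumption)
    api_key=os.environ["NVIDIA_API_KEY"],            # the new key field this issue asks for
)

resp = client.chat.completions.create(
    model="meta/llama-3.1-8b-instruct",              # example model id, may differ
    messages=[{"role": "user", "content": "Hello"}],
)
print(resp.choices[0].message.content)
```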
**Requires**
- [ ]…
-
**Is your feature request related to a problem? Please describe.**
I am trying to rerun the LLM if the generation is hallucinated, but I am getting a circular dependency error. Is there a way to…
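One way to express the rerun idea without wiring the checker back into the chain (and risking a cycle) is a plain retry loop around the call; `generate` and `looks_hallucinated` below are hypothetical placeholders, not this library's API:

```python
# Hypothetical retry loop: re-invoke the model while the output looks hallucinated.
# `generate` and `looks_hallucinated` are placeholders for your own calls.
import random


def generate(prompt: str) -> str:
    # Placeholder for the real LLM call.
    return f"answer to: {prompt}"


def looks_hallucinated(text: str) -> bool:
    # Placeholder hallucination check; replace with a real detector.
    return random.random() < 0.3


def generate_with_retries(prompt: str, max_attempts: int = 3) -> str:
    text = generate(prompt)
    for _ in range(max_attempts - 1):
        if not looks_hallucinated(text):
            break
        text = generate(prompt)  # rerun the model outside the chain itself
    return text
```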
-
I can take a crack at a PR here even though I'm not necessarily a pythonista - it would be nice to sub in other LLM providers or locally running LLMs
-
The results obtained from an LLM using the same prompt are not always the same, even when setting the temperature to 0. How did you handle this situation to make comparisons of results in yo…
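For what it's worth, a minimal sketch of one common workaround: fix the temperature (and, where the provider supports it, a seed), sample the same prompt several times, and compare the distribution of outputs rather than a single run. This assumes an OpenAI-compatible client; the model id is only an example:

```python
# Sketch: probe run-to-run variation for a fixed prompt at temperature 0.
# Assumes an OpenAI-compatible endpoint; `seed` is best-effort and not
# supported by every provider.
from collections import Counter

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
prompt = "List three prime numbers."

outputs = []
for _ in range(5):
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # example model, may differ
        messages=[{"role": "user", "content": prompt}],
        temperature=0,
        seed=42,              # best-effort determinism where supported
    )
    outputs.append(resp.choices[0].message.content.strip())

# If outputs still differ, compare how often each variant appears.
for text, count in Counter(outputs).most_common():
    print(count, text[:80])
```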
-
I realize OpenVINO was originally made for vision models, but I'm interested in using OpenVINO for fine-tuning LLMs. It appears there is support for fine-tuning ViT models but not for language models…
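For context on what currently works for language models with OpenVINO, inference (rather than fine-tuning) is commonly done through Optimum Intel; a minimal sketch, assuming `optimum[openvino]` and `transformers` are installed and using a small example model:

```python
# Sketch: running (not fine-tuning) a causal LM with OpenVINO via Optimum Intel.
# Assumes `pip install optimum[openvino] transformers`; model id is an example.
from optimum.intel import OVModelForCausalLM
from transformers import AutoTokenizer

model_id = "gpt2"  # small example model
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = OVModelForCausalLM.from_pretrained(model_id, export=True)  # export to OpenVINO IR on the fly

inputs = tokenizer("OpenVINO can run this prompt", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```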
-
Can we use other APIs and LLMs?
I'm trying to not get stuck in the OpenAI world.
-
I would suggest the `Ollama` API, as it is well documented and supports many LLMs.
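A minimal sketch of calling Ollama's HTTP API locally (assumes an Ollama server running on the default port and a model such as `llama3` already pulled):

```python
# Sketch: one-shot generation against a local Ollama server.
# Assumes `ollama serve` is running on the default port and the model is pulled.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",               # any locally pulled model
        "prompt": "Why use local LLMs?",
        "stream": False,                 # return one JSON object instead of a stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```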
-
We’re so happy to have you on board with the LADy project, Calder! We use the issue pages for many purposes, but we really enjoy noting good articles and our findings on every aspect of the project.
…