-
Great work with integrated data chunking and embedding.
Asking from a RAG context (complex documents, e.g., a financial statement): great work with the integrated data chunking and embedding. How does th…
-
Might be useful to be able to do this:
```bash
git diff | llm chat -m gpt-4 --in
```
Where the content doesn't actually get sent to the model at all (yet), but is instead added to the history an…
-
I want to use the OpenAI Python package to call my application, which is built on TaskingAI.
Could you provide the API call interface in the OpenAI format?
Much appreciated!
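For reference, here is a minimal sketch of pointing the official `openai` Python client at an OpenAI-compatible endpoint. The base URL, API key, and model name below are placeholders for whatever the TaskingAI deployment actually exposes, not confirmed values:

```python
from openai import OpenAI

# Placeholder endpoint and key: replace with the values of your
# TaskingAI-backed deployment if it exposes an OpenAI-compatible API.
client = OpenAI(
    base_url="http://localhost:8000/v1",  # hypothetical base URL
    api_key="YOUR_TASKINGAI_API_KEY",     # hypothetical key
)

response = client.chat.completions.create(
    model="my-taskingai-app",  # hypothetical model/app identifier
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)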
-
Hi, is there a way to train my own model and include it on top of one of the base models?
Example: `llava-v1.5-7b-q4-server.llamafile --external-model my_model.model`
-
This issue is a work in progress and will be further refined.
LLM and embedding model sources are currently defined as GUCs (Postgres configuration settings), e.g. `vectorize.openai_service_url = https://api.openai.com/v1` contains the base URL for Ope…
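As a rough illustration (not part of the issue itself), overriding such a GUC for the current session might look like the sketch below; the connection string is a placeholder, and depending on how the extension registers the parameter it may instead need to be set in `postgresql.conf` or via `ALTER SYSTEM`:

```python
import psycopg2

# Placeholder connection string; adjust for your database.
conn = psycopg2.connect("dbname=postgres user=postgres")
cur = conn.cursor()

# Override the base URL GUC for this session only.
cur.execute("SET vectorize.openai_service_url = 'https://api.openai.com/v1';")

# Confirm the value the extension will see.
cur.execute("SHOW vectorize.openai_service_url;")
print(cur.fetchone()[0])
```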
-
### Bug Description
Trying to execute:
```python
from llama_index.core import SimpleDirectoryReader
from llama_index.readers.file import PyMuPDFReader
reader_config = SimpleDirectoryReader(
…
```
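For comparison, a minimal sketch of a working `SimpleDirectoryReader` + `PyMuPDFReader` setup; the input directory is a placeholder, and this is not necessarily the reporter's exact configuration:

```python
from llama_index.core import SimpleDirectoryReader
from llama_index.readers.file import PyMuPDFReader

# Route .pdf files through PyMuPDFReader; "./data" is a placeholder directory.
reader = SimpleDirectoryReader(
    input_dir="./data",
    file_extractor={".pdf": PyMuPDFReader()},
)
documents = reader.load_data()
print(f"Loaded {len(documents)} documents")
```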
-
### What happened?
How can I configure Quivr to use a local Ollama model? I can't find a config example, and there is no env.example file.
-
To do:
- [ ] Create an icon at the top left of the chatbot's chat box that shows the model provider (e.g., OpenAI, Google), along with some text indicating the model
- [ ] Make it so the recent chats …
-
After running `streamlit run 1_home.py`, how can I build an agent? What should I input, and how can I upload or point to a file? Is there an example? No matter what I input, the result is always:
system_prompt=None file_paths=[] docs…
-
- [x] Choose 3 textbook diagrams as examples, in the same category
- [x] Choose 2 additional textbook examples, in different categories, to keep in mind but not pursue directly in detail during this…