-
I'm attempting to use an Azure-hosted ChatGPT model with curateGPT via the litellm proxy system.
I have the proxy running and tested, and the LLM package underlying curateGPT successfully using it, …
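For anyone reproducing this setup, a quick way to confirm the proxy end of the chain works is to hit it directly with the OpenAI client. This is only a hedged sketch: the port (4000) and the model alias (azure-gpt-4o) are assumptions that depend on your litellm proxy config.
```
# Hedged sanity check of the litellm proxy, independent of curateGPT.
# Assumptions: proxy listening on localhost:4000, Azure deployment exposed
# under the alias "azure-gpt-4o" -- adjust both to your proxy config.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:4000", api_key="sk-anything")  # proxy key, not the Azure key

resp = client.chat.completions.create(
    model="azure-gpt-4o",
    messages=[{"role": "user", "content": "ping"}],
)
print(resp.choices[0].message.content)
```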
-
**Describe the bug**
Hi, all. I'm working on a blog article, following a mix of local documentation and the Intelligent App Workshop, but instead of going with Falcon, I've gone with the Mistral 7B model, and at …
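For context on the swap, loading Mistral 7B in place of Falcon is usually just a different checkpoint id; a minimal sketch with Hugging Face transformers follows (the model id and generation settings are assumptions, not taken from the workshop).
```
# Hedged sketch: load Mistral 7B Instruct via transformers; the model id and
# generation settings are illustrative assumptions.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-7B-Instruct-v0.2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

inputs = tokenizer.apply_chat_template(
    [{"role": "user", "content": "Hello!"}], return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```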
-
I don't understand how to set chat_llm to ollama if there is no provision for setting utility_llm and/or embedding_llm to their local (ollama) counterparts. Yes, I assume that prompting will be a challenge…
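Not an answer to the config question itself, but for reference both the chat and the embedding side do exist locally. A hedged sketch against a running Ollama server using the ollama Python client (the model names are assumptions):
```
# Hedged sketch: local chat + embedding calls against a running Ollama server.
# Model names ("llama3", "nomic-embed-text") are assumptions.
import ollama

chat_reply = ollama.chat(
    model="llama3",
    messages=[{"role": "user", "content": "Summarize this record in one line."}],
)
print(chat_reply["message"]["content"])

emb = ollama.embeddings(model="nomic-embed-text", prompt="some text to embed")
print(len(emb["embedding"]))
```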
-
### Question Validation
- [X] I have searched both the documentation and discord for an answer.
### Question
I'm executing the following line of code:
```
new_index.storage_context.persist(pers…
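# --- Hedged sketch, not from the original post -------------------------------
# Persisting is usually paired with reloading via StorageContext and
# load_index_from_storage (llama_index >= 0.10 import paths assumed); the
# persist_dir value below is an illustrative assumption.
from llama_index.core import StorageContext, load_index_from_storage

new_index.storage_context.persist(persist_dir="./storage")
storage_context = StorageContext.from_defaults(persist_dir="./storage")
reloaded_index = load_index_from_storage(storage_context)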
-
## 🐛 Bug
Cannot run Llama-3-8B-Instruct-q4f16_1-MLC
## To Reproduce
Steps to reproduce the behavior:
1. conda create --name mlc-prebuilt python=3.11
2. conda activate mlc-prebuilt
3…
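For comparison (not from the report above), recent mlc_llm builds load a prebuilt weight package through the Python MLCEngine API roughly as in the hedged sketch below; the HF:// model string and the OpenAI-style call follow the MLC quick-start docs and are assumptions here.
```
# Hedged sketch of loading the prebuilt package with the mlc_llm Python API.
# The HF:// model path and chat call mirror the MLC quick-start; treat them
# as assumptions, not a confirmed repro of the setup above.
from mlc_llm import MLCEngine

model = "HF://mlc-ai/Llama-3-8B-Instruct-q4f16_1-MLC"
engine = MLCEngine(model)

response = engine.chat.completions.create(
    messages=[{"role": "user", "content": "Hello"}],
    model=model,
    stream=False,
)
print(response.choices[0].message.content)
engine.terminate()
```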
-
I saw this error:
```
value is not a valid list (type=type_error.list))
Evaluating: 33%|█████████████████████████████▎ | 1/3 [01:37 0
v…
-
**Describe the bug**
When running the `chat-with-wikipedia` flow, I'm encountering the following error:
```
...
File "/usr/local/lib/python3.9/site-packages/promptflow/tracing/_trace.py", line 145, in
…
-
When I run `python llava_llama_v2_visual_attack.py --n_iters 5000 --constrained --save_dir results_llava_llama_v2_constrained_16 --eps 16 --alpha 1`, I run into the following problems.
model = /mnt/local/LL…
-
- [ ] [paper-qa/README.md at main · Future-House/paper-qa](https://github.com/Future-House/paper-qa/blob/main/README.md?plain=1)
# PaperQA2
[![GitHub](https://img.shields.io/badge/github-%23121011.s…
-
### Checked other resources
- [X] I added a very descriptive title to this issue.
- [X] I searched the LangChain documentation with the integrated search.
- [X] I used the GitHub search to find a sim…