-
Requires extensive automated and manual testing, as well as code changes (imports), which are part of https://github.com/Chainlit/chainlit/pull/1267.
-
Search is NOT limited to the given txt file.
```python
from crewai_tools import TXTSearchTool
txt_search_tool = TXTSearchTool(
    txt="kunst.txt",
    config=dict(
        llm=dict(
            provid…
```
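For reference, a complete configuration in this shape might look like the sketch below. The provider and model names are placeholders rather than the original poster's settings, and the nesting follows the llm/embedder config layout used by crewai_tools:

```python
from crewai_tools import TXTSearchTool

# Hypothetical, self-contained example: point the tool at one txt file and
# configure both the LLM and the embedder explicitly. Provider and model
# names below are placeholders, not the poster's actual configuration.
txt_search_tool = TXTSearchTool(
    txt="kunst.txt",
    config=dict(
        llm=dict(
            provider="ollama",  # e.g. "openai", "google", "ollama", ...
            config=dict(model="llama3"),
        ),
        embedder=dict(
            provider="ollama",
            config=dict(model="nomic-embed-text"),
        ),
    ),
)

# Query the tool directly (assumes the .run() helper exposed by crewai tools).
# Whether results really stay within kunst.txt is exactly what this report questions.
print(txt_search_tool.run("What does the file say about art?"))
```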
-
## ❓ InternalError when running llava model
I'm new to mlc-llm and I'm not sure whether this is a bug or whether I'm doing something incorrectly. So far I have not managed to run any model successfully. I have tr…
plufz updated 1 month ago
-
When I run `python llava_llama_v2_visual_attack.py --n_iters 5000 --constrained --save_dir results_llava_llama_v2_constrained_16 --eps 16 --alpha 1`, I run into the following problems.
model = /mnt/local/LL…
-
### Your current environment
The output of `python collect_env.py`
```text
Collecting environment information...
PyTorch version: 2.4…
```
-
I have installed trl
-
I don't understand how to set the chat_llm to ollama if there is no provision for also setting utility_llm and/or embedding_llm to their local (ollama) counterparts. Yes, I assume that prompting will be a challenge…
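For illustration, a minimal sketch of what local (ollama) counterparts for all three roles could look like, assuming a LangChain-based setup; the variable names mirror the question and the model names are placeholders:

```python
# Hypothetical sketch: the names chat_llm / utility_llm / embedding_llm mirror
# the question above and are not taken from any specific library's config.
from langchain_ollama import ChatOllama, OllamaEmbeddings

chat_llm = ChatOllama(model="llama3.1")                      # main conversational model
utility_llm = ChatOllama(model="llama3.1", temperature=0.0)  # deterministic utility tasks
embedding_llm = OllamaEmbeddings(model="nomic-embed-text")   # local embedding model

print(chat_llm.invoke("Say hello in one word.").content)
print(len(embedding_llm.embed_query("hello")))  # dimensionality of the embedding
```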
-
### Your current environment
The output of `python collect_env.py`
```text
PyTorch version: 2.4.0+cu121
Is debug build: False
CUDA used to build PyTorch: 12.1
ROCM used to build PyTorch: N/A…
```
jgen1 updated 3 weeks ago
-
I'd like to run Live LLaVA completely locally on the Jetson, including the web browser.
However, if I turn off Wi-Fi before starting Live LLaVA, the video won't play in the browser.
If I turn off Wi-Fi after…
-
(Unfortunately, you will need a physical Pixel 8 or above to implement this.)
Many Commons contributors contribute in various languages, for instance in Urdu when posting a picture of a local dish th…