-
Can the Ollama URL be configured to point to a remote box?
Or try using an SSH tunnel to make the remote Ollama appear to be local.
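A minimal sketch of both options, using placeholder names: a remote host `gpu-box`, the default Ollama port 11434, and a `llama3` model already pulled on that box:

```python
# Option 1: point the client at the remote box directly (the remote Ollama
#           must listen on 0.0.0.0, e.g. OLLAMA_HOST=0.0.0.0 on that box).
# Option 2: keep a localhost URL and forward the port over SSH first:
#           ssh -N -L 11434:localhost:11434 user@gpu-box
# Host, user, and model names are placeholders, not from this issue.
import requests

OLLAMA_URL = "http://gpu-box:11434"      # option 1: remote URL
# OLLAMA_URL = "http://localhost:11434"  # option 2: through the SSH tunnel

resp = requests.post(
    f"{OLLAMA_URL}/api/generate",
    json={"model": "llama3", "prompt": "Hello from a remote Ollama", "stream": False},
    timeout=120,
)
print(resp.json()["response"])
```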
-
**Is your feature request related to a problem? Please describe.**
We are exploring using LaVague to accomplish web automation, but the limitation is relying on public-facing models. Can we supp…
-
### System Info
ubuntu 20.04
tensorrt 10.0.1
tensorrt-cu12 10.0.1
tensorrt-cu12-bindings 10.0.1
tensorrt-cu12-libs 10.0.1
tensorrt-llm 0.10.…
-
Thanks for sharing! I have the following error; could you please help:
Traceback (most recent call last):
File "/root/miniconda3/envs/test/lib/python3.10/site-packages/transformers/configuration_utils.py", line 675, in _get_config_dict
resolved_…
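For reference, a call like the one below is the usual entry point that ends up in `_get_config_dict`; a minimal sketch, assuming the model lives in a local checkpoint directory (the path is a placeholder, not taken from the traceback):

```python
# Minimal sketch: AutoConfig.from_pretrained() goes through
# configuration_utils._get_config_dict(). The path is a placeholder.
from transformers import AutoConfig

config = AutoConfig.from_pretrained(
    "/path/to/local/checkpoint",  # placeholder; a Hub repo id also works
    local_files_only=True,        # skip network lookups when files are local
)
print(config)
```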
-
I'd like to run live llava completely locally on the Jetson, including the web browser.
However, if I turn off wifi before starting live llava, the video won't play in the browser.
If I turn off wifi after…
-
**Describe the bug**
Hi all. I'm working on a blog article, following a mix of local documentation and the Intelligent App Workshop, but instead of going with Falcon, I've gone with the Mistral 7B model, and at …
-
### System Info
ubuntu 22.04
torch 2.5.0
cuda 12.4
running on a single GPU with CUDA_VISIBLE_DEVICES=1 (see the sketch below)
![image](https://github.com/user-attachments/assets/30134067-427a-4421-94d1-8d958ec628f5)
…
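A minimal sketch of what that environment setting does, assuming torch 2.5.0 as listed above (the printouts are illustrative, not from this report):

```python
# With CUDA_VISIBLE_DEVICES=1 set before CUDA is initialized, torch sees
# exactly one device: physical GPU 1 is exposed to the process as cuda:0.
import os
os.environ.setdefault("CUDA_VISIBLE_DEVICES", "1")  # set before torch touches CUDA
import torch

print(torch.cuda.device_count())      # expected: 1
print(torch.cuda.current_device())    # 0 (the remapped index)
print(torch.cuda.get_device_name(0))  # the physical GPU 1
```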
-
Dev machine: ubuntu 20.04, mnn 3.0.0
Models (Hugging Face): Qwen2.5-0.5B-Instruct and Qwen2.5-0.5B-Instruct-GPTQ-Int8
## Exporting the ONNX model
$ python mnn/transformers/llm/export/llmexport.py --path pretrained_model/Qwen2.5…
-
### Search before asking
- [X] I have searched the [issues](https://github.com/HamaWhiteGG/autogen4j/issues?q=is%3Aissue) and found no similar feature request.
### Description
There i…
-
When I run python llava_llama_v2_visual_attack.py --n_iters 5000 --constrained --save_dir results_llava_llama_v2_constrained_16 --eps 16 --alpha 1, I run into the following problems.
model = /mnt/local/LL…