-
### System Info
# !pip install trl transformers==4.35.2 accelerate peft==0.6.2 -Uqqq
!pip install trl transformers accelerate peft==0.6.2 -Uqqq
!pip install datasets bitsandbytes einops wandb -Uqqq…
-
### Your current environment
```text
PyTorch version: 2.2.1+cu121
Is debug build: False
CUDA used to build PyTorch: 12.1
ROCM used to build PyTorch: N/A
OS: Ubuntu 22.04.3 LTS (x86_64)
GCC ve…
-
### This issue is a centralized place to list and track work on adding support for new ops in the MPS backend.
[**PyTorch MPS Ops Project**](https://github.com/users/kulinseth/projects/1/vi…
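For anyone triaging ops against this list: an op that is not yet implemented for MPS raises a `NotImplementedError` when run on an `mps` tensor, and setting `PYTORCH_ENABLE_MPS_FALLBACK=1` makes it fall back to the CPU with a warning instead. A rough sketch of that check (the op and shapes below are arbitrary, chosen only for illustration):

```python
import torch

# Minimal sketch: probe whether a given op runs on the MPS backend.
if torch.backends.mps.is_available():
    x = torch.randn(8, 8, device="mps")
    y = torch.linalg.norm(x)  # swap in the op you care about
    print(y.item())
else:
    # Distinguish "MPS not compiled into this build" from "no MPS device found".
    print("MPS built into this PyTorch build:", torch.backends.mps.is_built())
```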
-
Thanks for the good work! I tried to load the model via HF, but I am getting this error:
![image](https://github.com/mbzuai-oryx/LLaVA-pp/assets/565559/c8dcac9f-1e04-40e8-9845-48e874439e9f)
Any …
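Without the full traceback it is hard to say where this fails, but for comparison, a bare-bones load via `transformers` looks roughly like the sketch below (the repo id is a placeholder, not a real checkpoint name; substitute whatever you are actually loading):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "<org>/<llava-pp-checkpoint>"  # placeholder repo id, substitute your own

tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    trust_remote_code=True,  # LLaVA-style checkpoints often ship custom modeling code
    torch_dtype="auto",
)
print(model.config.architectures)
```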
-
/vllm_2$ python examples/phi3v_example.py
WARNING 06-21 14:53:06 ray_utils.py:46] Failed to import Ray with ModuleNotFoundError("No module named 'ray'"). For multi-node inference, please install Ray …
-
### Your current environment
```
PyTorch version: 2.1.2
Is debug build: False
CUDA used to build PyTorch: 12.1
ROCM used to build PyTorch: N/A
OS: Ubuntu 20.04.6 LTS (x86_64)
GCC version: (Ub…
-
Due to the early flood of feedback for Doom Eternal while the community was figuring out how to get this game to run, the discussion for this game has been reset. If you have an interest in the community e…
-
### Your current environment
PyTorch version: 2.4.1+cu121
Is debug build: False
CUDA used to build PyTorch: 12.1
ROCM used to build PyTorch: N/A
OS: Ubuntu 22.04.3 LTS (x86_64)
GCC version: (U…
-
I might be misinterpreting, but it looks like only the 4k-context-length phi3 model is currently supported: https://ollama.com/library/phi3
(at least without downloading the weights separately and …
-
Hi,
I wanted to try it out for testing purposes. For that, I downloaded both the `llava_med_in_text_60k_delta.zip` and the LLaMA weights, but when I tried to run the following command:
```
python3 -m ll…