Open thonore75 opened 6 days ago
(windows-dev-tensorRT-LLM)
- OS: Windows 11 Pro (Version 23H2, build 22631.4037)
- CPU: AMD Ryzen Threadripper PRO 5955WX (16 cores)
- RAM: 32 GB
- GPU 1: NVIDIA GeForce RTX 3090
- GPU 2: NVIDIA GeForce RTX 3090
- Storage: 599 GB local disk (C:)
Mistral 8x7B Instruct Q4
(~24GB), with both GPUs turned on: https://github.com/user-attachments/assets/a9741c50-5073-4e51-9344-e136f63f5d0f
Aya 23 35B Q4
(~20GB), also when using both GPUs: https://github.com/user-attachments/assets/37772b49-d524-47f2-ac39-58a49389d670
Deepseek Coder 33B Instruct
(~18GB): I cannot run this model at all, whether GPU acceleration is on or off, so it may be a separate issue, reported here: https://github.com/janhq/jan/issues/3703
https://github.com/user-attachments/assets/f8808b78-f416-4314-837c-f7e76f4d8eaa
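As a rough back-of-the-envelope check (not Jan's actual allocator), the models above should comfortably fit when split across two 24 GB RTX 3090s, which suggests the 2-GPU failures are not simple VRAM exhaustion. The even-split assumption and the 1.5 GB per-GPU overhead in this sketch are guesses for illustration only:

```python
def fits_on_gpus(model_gb: float, vram_per_gpu_gb: float, n_gpus: int,
                 overhead_gb: float = 1.5) -> bool:
    """True if an even weight split plus a per-GPU overhead fits in VRAM.

    overhead_gb is a guessed allowance for KV cache and CUDA buffers;
    real loaders split by layer, so the split is not perfectly even.
    """
    per_gpu_load = model_gb / n_gpus + overhead_gb
    return per_gpu_load <= vram_per_gpu_gb

# ~24 GB Q4 model across two 24 GB cards: ~13.5 GB per card, fits
print(fits_on_gpus(24.0, 24.0, 2))  # True
# Same model on a single 12 GB card: ~25.5 GB needed, does not fit
print(fits_on_gpus(24.0, 12.0, 1))  # False
```

If a model passes this kind of check but still fails only in the 2-GPU configuration, the logs are more likely to point at a device-selection or split bug than at running out of memory.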
Here's my app logs:
Quick check @thonore75: which models cannot be run on your end?
Here are the models I can launch with 1 GPU but not with 2:
After my tests, I tried to play a video you posted here (in Google Chrome), but it would not play at all. Jan was running with no model loaded; my last test was a model that failed to load. Once I stopped Jan, I was able to play your videos.
Jan Compatibility.xlsx
app - 1_GPU_1.log
app - 2_GPUs.log
app - CPU.log
app - 1_GPU_0.log
I did some extra tests! The log was cleared before each run so that each configuration has its own separate log. 4 tested configurations:
For some models, loading sometimes failed right after a loading issue with the previously tested model; but after successfully loading a working model, the previously failing model would then load.
Jan version
0.5.3
Describe the Bug
I imported many models, and some of them fail to load when I select both of my graphics cards (RTX 3060 12GB). If I unselect one of them, the model loads.
It would be great if the model list could indicate whether a model supports multi-GPU.
Steps to Reproduce
Screenshots / Logs
No response
What is your OS?