Closed: StudyingLover closed this issue 2 months ago.
Thank you for this issue. Currently, llama.cpp does not support the 310B (the NPU used in the Orange Pi). We warmly welcome contributions adapting it for the Orange Pi.
Please use a feature request instead of a bug report.
Okay, do I need to open a new issue? I couldn't find where to change the label. By the way, I can provide SSH access to an Orange Pi to assist with community development. We really need help with running large-model inference on the Orange Pi AI Pro.
You can create another issue and close this one. Supporting the 310B chip involves a fair amount of work, and we currently don't have enough manpower for it. If you're interested, you're welcome to contribute 310B support.
OK, thanks.
What happened?
Name and Version
version: 3726 (b34e0234) built with cc (Ubuntu 11.4.0-1ubuntu1~22.04) 11.4.0 for aarch64-linux-gnu
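For context, a binary like the one above (version 3726, built with GCC 11.4 for aarch64-linux-gnu) would typically come from a native CMake build on the board. The snippet below is a minimal sketch of such a build with the Ascend CANN backend enabled; the `-DGGML_CANN=on` flag and the CANN toolkit install path are assumptions, since the report does not include the actual build commands.

```shell
# Assumed environment setup: source the Ascend CANN toolkit before configuring.
# The install path below is a common default, not taken from this report.
source /usr/local/Ascend/ascend-toolkit/set_env.sh

# Configure and build llama.cpp with the CANN backend enabled (assumption:
# the reporter built with GGML_CANN=on; a plain CPU build would omit the flag).
cmake -B build -DGGML_CANN=on -DCMAKE_BUILD_TYPE=release
cmake --build build --config release -j

# Print the version string to compare against the one in this report.
./build/bin/llama-cli --version
```

As noted in the comments above, even a successful build does not help on the Orange Pi AI Pro yet, since the CANN backend does not currently support the 310B.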
What operating system are you seeing the problem on?
Linux
Relevant log output