Self Checks
[X] I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
[X] [FOR CHINESE USERS] Please be sure to submit the issue in English, otherwise it will be closed. Thank you! :)
[X] Please do not modify this template :) and fill in all the required fields.
1. Is this request related to a challenge you're experiencing? Tell me about your story.
Today's integrated GPUs (iGPUs) are quite powerful. The AMD Radeon 780M and 880M, for example, perform comparably to mid-range dedicated NVIDIA cards such as the GTX 1660, and they also ship in handheld devices like the ASUS ROG Ally (there is plenty of coverage of how capable these iGPUs are).
Upcoming laptops with AMD processors include these iGPUs as well, and they are powerful enough to run quantized 8-13B LLMs.
Would it be possible to add support for these iGPUs so the project can run without an NVIDIA GPU? If so, that would be great, as many more people would be able to experience this awesome project.
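To make the request concrete, the feature would essentially mean a device-selection fallback: try CUDA first, then an AMD iGPU backend (ROCm/HIP or Vulkan), then CPU. The sketch below is purely illustrative and assumes nothing about this project's actual API; `select_device`, the backend names, and the preference order are all hypothetical.

```python
# Hypothetical backend-selection sketch for iGPU support.
# "cuda", "rocm", "vulkan" are illustrative backend names, not this
# project's real API; the priority order is an assumption.

from typing import Mapping

PREFERENCE = ("cuda", "rocm", "vulkan", "cpu")  # assumed priority order


def select_device(available: Mapping[str, bool]) -> str:
    """Return the first available backend in preference order.

    `available` maps backend names to whether the runtime detected
    them; CPU is always treated as the last-resort fallback.
    """
    for backend in PREFERENCE:
        if backend == "cpu" or available.get(backend, False):
            return backend
    return "cpu"
```

On a Radeon 780M/880M machine without an NVIDIA card, detection would presumably report something like `{"cuda": False, "rocm": True}`, so the project would fall through to the iGPU backend instead of refusing to start.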
2. Additional context or comments
No response
3. Can you help us with this feature?
[ ] I am interested in contributing to this feature.