KoboldAI / KoboldAI-Client

For GGUF support, see KoboldCPP: https://github.com/LostRuins/koboldcpp
https://koboldai.com
GNU Affero General Public License v3.0

MLC-LLM Integration? #324

Open ArcturusMayer opened 1 year ago

ArcturusMayer commented 1 year ago

Perhaps it would be a good idea to add support for the MLC-AI team's new project, https://github.com/mlc-ai/mlc-llm, so models can run on any graphics card that supports the Vulkan API, just as llama.cpp support was added in the past. For example, I have an RX 570 with 8 GB of VRAM that supports Vulkan but is not supported by current ROCm releases, so for people in a similar situation this would be very important and useful.
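
To illustrate the kind of integration being requested, here is a minimal sketch of generating text through MLC-LLM's Python chat interface on a Vulkan device. The module name, class, model identifier, and `device` parameter below are assumptions based on the upstream MLC-LLM project and may differ between versions; treat this as a sketch rather than a confirmed API.

```python
# Hypothetical sketch of calling MLC-LLM from Python on a Vulkan GPU.
# The import path, class name, model string, and device argument are
# assumptions and may not match the installed MLC-LLM version.
from mlc_chat import ChatModule

# Load a quantized model compiled for the Vulkan backend, so cards
# without ROCm/CUDA support (e.g. an RX 570) can still run it.
chat = ChatModule(
    model="Llama-2-7b-chat-hf-q4f16_1",  # assumed model artifact name
    device="vulkan",                      # assumed device selector
)

# Generate a completion for a prompt, analogous to how KoboldAI would
# forward a story prompt to a backend.
output = chat.generate(prompt="Write one sentence about dragons.")
print(output)
```

A KoboldAI backend built on something like this would mainly need to wrap prompt submission and token streaming, similar in spirit to the existing llama.cpp integration mentioned above.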