huggingface / huggingface.js

Utilities to use the Hugging Face Hub API
https://hf.co/docs/huggingface.js
MIT License

vLLM Local App Option should only display on GGUF models that are llama.cpp compatible #926

Closed · simon-mo closed this 1 month ago

simon-mo commented 1 month ago

While vLLM does support GGUF for models that ship a full config.json, this model is not supported: https://huggingface.co/kyutai/moshika-candle-q8

julien-c commented 1 month ago

@simon-mo I think this could be solved by switching this line https://github.com/huggingface/huggingface.js/blob/cc01ed508cfcc49da56f4a7dd761f18180956692/packages/tasks/src/local-apps.ts#L218

from isGgufModel to isLlamaCppGgufModel

WDYT?

cc @Vaibhavs10 too
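The suggested swap could be sketched as below. This is only an illustration: the `ModelData` shape, the `architecture` field, and the architecture allow-list are assumptions for the sketch, not the repo's actual definitions of `isGgufModel` / `isLlamaCppGgufModel` in local-apps.ts.

```typescript
// Hypothetical sketch of the proposed stricter check.
// Assumption: a minimal ModelData shape with tags and optional GGUF metadata.
interface ModelData {
  tags: string[];
  gguf?: { architecture?: string };
}

// Loose check: any repo carrying GGUF weights.
function isGgufModel(model: ModelData): boolean {
  return model.tags.includes("gguf");
}

// Illustrative subset of architectures llama.cpp can load.
const LLAMA_CPP_ARCHITECTURES = new Set(["llama", "mistral", "gemma", "qwen2"]);

// Stricter check: GGUF *and* a llama.cpp-supported architecture, so a
// candle-only GGUF export (like moshika-candle-q8) would be excluded.
function isLlamaCppGgufModel(model: ModelData): boolean {
  return (
    isGgufModel(model) &&
    model.gguf?.architecture !== undefined &&
    LLAMA_CPP_ARCHITECTURES.has(model.gguf.architecture)
  );
}
```

With a gate like this, the vLLM entry in the local-apps list would only show on GGUF models llama.cpp itself could run, rather than on every repo tagged `gguf`.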

julien-c commented 1 month ago

@simon-mo please review @Vaibhavs10's PR^ – or if you want to author the change please open a similar PR! 🙏