lightaime opened this issue 3 months ago
Hey @lightaime, this model is supported by Ollama; should we do a native integration? Reference: https://ollama.com/library/smollm (a sketch of what that could look like is below).
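For reference, here is a minimal sketch of calling SmolLM through Ollama's OpenAI-compatible endpoint (served locally at http://localhost:11434/v1), with the smollm model tag taken from the library page above. This is illustrative only, not the project's actual integration API:

```ts
import OpenAI from "openai";

// Ollama exposes an OpenAI-compatible endpoint; the apiKey field is
// required by the client library but ignored by Ollama itself.
const client = new OpenAI({
  baseURL: "http://localhost:11434/v1",
  apiKey: "ollama",
});

async function main() {
  // "smollm" is the tag from https://ollama.com/library/smollm
  // (pull it first with `ollama pull smollm`).
  const reply = await client.chat.completions.create({
    model: "smollm",
    messages: [{ role: "user", content: "Say hello in one sentence." }],
  });
  console.log(reply.choices[0].message.content);
}

main().catch(console.error);
```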
The difference between WebLLM and an LLM web app (note: setting aside the browser's dependency on a GPU): WebLLM runs the model entirely in the browser on WebGPU, whereas a typical LLM web app is only a frontend that calls a server-hosted model. Here are the key features of WebLLM (per https://webllm.mlc.ai/):

- In-browser inference accelerated with WebGPU, no server round-trips
- OpenAI-style API, including streaming and structured JSON generation
- A catalog of prebuilt models, plus custom models in MLC format
- Web Worker / Service Worker and Chrome extension support
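To make the WebGPU angle concrete, here is a minimal sketch using WebLLM's documented CreateMLCEngine API. The SmolLM model id below is an assumption; check the prebuilt model list at https://webllm.mlc.ai/ for the exact entry:

```ts
import { CreateMLCEngine } from "@mlc-ai/web-llm";

async function demo() {
  // Downloads the weights into the browser and compiles the WebGPU
  // kernels; initProgressCallback reports loading progress.
  const engine = await CreateMLCEngine(
    "SmolLM-360M-Instruct-q4f16_1-MLC", // assumed id; verify against the prebuilt list
    { initProgressCallback: (report) => console.log(report.text) },
  );

  // OpenAI-style chat completion, executed entirely client-side.
  const reply = await engine.chat.completions.create({
    messages: [{ role: "user", content: "What is WebGPU?" }],
  });
  console.log(reply.choices[0]?.message.content);
}

demo().catch(console.error);
```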
Required prerequisites
Motivation
Add SmolLM: https://huggingface.co/blog/smollm (WebGPU demo: https://huggingface.co/spaces/HuggingFaceTB/SmolLM-360M-Instruct-WebGPU), and add WebGPU support: https://webllm.mlc.ai/
Solution
No response
Alternatives
No response
Additional context
No response