0hq / WebGPT

Run GPT models in the browser with WebGPU. An implementation of GPT inference in under ~1500 lines of vanilla JavaScript.
https://kmeans.org

how to use your llm.ts library with this? #37

Open ralyodio opened 1 year ago

ralyodio commented 1 year ago

I want to load models from Hugging Face... How do I do that?

felladrin commented 1 year ago

It requires you to convert HF models into a specific format, explained in other/conversion_scripts/README.md.

In that folder there's also a script to convert PyTorch checkpoints and another to convert PyTorch pre-trained models.
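As a rough illustration of what such a conversion involves (this is not the repo's actual script, and the exact weight layout WebGPT expects is the one documented in other/conversion_scripts/README.md), a minimal sketch might export a Hugging Face GPT-2 checkpoint's tensors to raw float32 binaries plus a shape manifest. The output folder and file-naming scheme below are made up for the example.

```python
# Sketch only: dump a Hugging Face GPT-2 checkpoint's tensors to raw float32
# .bin files plus a JSON manifest of shapes. The real conversion scripts and
# target format live in other/conversion_scripts/ of the WebGPT repo.
import json
from pathlib import Path

import numpy as np
from transformers import GPT2LMHeadModel

out_dir = Path("gpt2_weights")  # illustrative output folder
out_dir.mkdir(exist_ok=True)

model = GPT2LMHeadModel.from_pretrained("gpt2")
manifest = {}

for name, tensor in model.state_dict().items():
    array = tensor.detach().cpu().numpy().astype(np.float32)
    file_name = name.replace(".", "_") + ".bin"
    array.tofile(out_dir / file_name)  # raw little-endian float32 bytes
    manifest[name] = {"file": file_name, "shape": list(array.shape)}

with open(out_dir / "manifest.json", "w") as f:
    json.dump(manifest, f, indent=2)
```

The browser side would then fetch binaries like these and upload them into GPU buffers, but for WebGPT specifically you should follow the format and scripts described in that README rather than this sketch.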