ngxson / wllama

WebAssembly binding for llama.cpp - Enabling in-browser LLM inference
https://huggingface.co/spaces/ngxson/wllama
MIT License

Error when loading a model via relative path #63

Closed: felladrin closed this issue 5 months ago

felladrin commented 5 months ago
[Screenshot 2024-06-05 at 00:13:20]

Originally posted by @flatsiedatsie in https://github.com/ngxson/wllama/issues/61#issuecomment-2148507658


It seems wllama will always fail to load models via a relative path because of this line:

https://github.com/ngxson/wllama/blob/041e2abfa4335b0e765d676156b5ecd1b894ff1b/src/downloader/multi-downloads.ts#L48

For example:

new URL("/models/mermaid/Mermaid-Llama-3-3B-Pruned.04_K_M-00001-of-00008-gguf")

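(For reference: the `URL` constructor throws when it is given a relative or root-relative path without a base URL, which is exactly what happens on the line linked above. A minimal sketch, with an illustrative file name:)

```ts
// Throws "TypeError: Failed to construct 'URL': Invalid URL" in the browser,
// because a root-relative path has no scheme or host.
const broken = new URL("/models/model.gguf");

// It only succeeds when a base URL is supplied explicitly, e.g. the current page:
const resolved = new URL("/models/model.gguf", self.location.href);
console.log(resolved.href); // e.g. "https://example.com/models/model.gguf"
```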
ngxson commented 5 months ago

Yes, this is because I left this code as-is when I copied the WebBlob implementation from huggingface.js.

I think we can get rid of the URL and simply use a string here. I can assign this issue to you if you want. What do you think, @felladrin?
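A rough sketch of what dropping the URL object could look like. The path and the Range header below are illustrative rather than the actual multi-downloads.ts code; the point is that fetch() accepts a plain string and resolves it against the page's base URL, so relative paths work without any URL construction:

```ts
// Sketch only: pass the user-supplied string straight to fetch() instead of
// wrapping it in new URL(). The browser resolves relative paths against the
// document's base URL.
const res = await fetch("./models/model.gguf", {
  // A chunked downloader might issue ranged requests like this one.
  headers: { Range: "bytes=0-1048575" },
});
const firstChunk: ArrayBuffer = await res.arrayBuffer();
```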

felladrin commented 5 months ago

Oh yeah, I can remove the URL, test it, and then create a pull request. You may assign it to me.

flatsiedatsie commented 5 months ago

Whoop! Time for a new version release? I was implementing the simpler approach this enables today, but had forgotten that it wasn't in a release yet. Loving it though; it makes my code a lot simpler.
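For context, the simpler approach mentioned here is serving the GGUF file alongside the web app and pointing wllama at it with a relative path. A hedged sketch of how that could look once the fix is released, using wllama's loadModelFromUrl; the asset paths and model file name are illustrative, not taken from the project docs:

```ts
import { Wllama } from "@wllama/wllama";

// Illustrative asset paths; adjust to wherever the wllama WASM files are served.
const wllama = new Wllama({
  "single-thread/wllama.wasm": "./esm/single-thread/wllama.wasm",
  "multi-thread/wllama.wasm": "./esm/multi-thread/wllama.wasm",
});

// After the fix, a model served from the same origin can be referenced by a
// relative path (the file name here is made up for the example).
await wllama.loadModelFromUrl("./models/my-model.gguf");
```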