mlc-ai / web-llm-chat

Chat with AI large language models running natively in your browser. Enjoy private, server-free, seamless AI conversations.
https://chat.webllm.ai/
Apache License 2.0

[Feature Request]: Background sync of models #54

Open sdmorrey opened 1 month ago

sdmorrey commented 1 month ago

Problem Description

When I start a new chat, I often want to chat with a different model than the one I used last. At present this is handled via a dropdown with an inscrutable list of names I have to try to mentally translate. Worse, I have to remember which one(s) I've already downloaded. This is a huge amount of cognitive load before my morning coffee kicks in.

While that's not so bad on its own, what makes it truly problematic is that at present a model doesn't start downloading until the first time it's needed.

The download process, at least on my end, doesn't seem to detect when it has been interrupted.

The end result is that if I want to chat with a new model, I have to sit and stare while waiting for it to finish loading, and often I have to manually stop and restart the download. Frequently this happens because I selected the wrong variant of a model I had already downloaded.

Solution Description

A better solution would be a configuration screen with a list of models available for download, showing each model's size in GB and a description, or at least a link to its HF page. A simple checkbox would do. From there, the selected model(s) would download in the background, and it would be lovely if the app could restart failed downloads automatically.

This would replace the current model selector in chat with a list of only the models that have already been downloaded. That way I don't start chatting with a model I don't actually have and then wait ages for it to download.
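To make the idea concrete, here is a minimal sketch of the bookkeeping this feature would need: filtering the user's selections down to models not yet cached, and wrapping each download in automatic retries with exponential backoff. All names here (`ModelInfo`, `pendingDownloads`, `downloadWithRetry`) are hypothetical and not part of the actual web-llm-chat codebase; the real model config shape and download APIs would differ.

```typescript
// Hypothetical model descriptor; the real WebLLM config shape differs.
interface ModelInfo {
  id: string;        // e.g. a model/quantization variant name
  sizeGB: number;    // approximate download size shown in the picker
  selected: boolean; // user ticked the checkbox in settings
}

// Models the user selected but that aren't in the local cache yet.
function pendingDownloads(
  models: ModelInfo[],
  cachedIds: Set<string>,
): ModelInfo[] {
  return models.filter((m) => m.selected && !cachedIds.has(m.id));
}

// Retry a download with exponential backoff so an interrupted
// download restarts automatically instead of stalling forever.
async function downloadWithRetry(
  download: () => Promise<void>,
  maxAttempts = 3,
  baseDelayMs = 1000,
): Promise<void> {
  for (let attempt = 1; ; attempt++) {
    try {
      return await download();
    } catch (err) {
      if (attempt >= maxAttempts) throw err;
      // Wait 1x, 2x, 4x, ... the base delay between attempts.
      await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** (attempt - 1)));
    }
  }
}
```

In a browser, the chat-time model selector would then be populated from the cached set alone, while `pendingDownloads` drives a background queue (e.g. fired from a service worker) that calls `downloadWithRetry` per model.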

Alternatives Considered

No response

Additional Context

Great Job so far by the way!

Neet-Nestor commented 1 month ago

Thanks for the suggestion. This is very similar to something on our roadmap and thank you for bringing some additional valuable points to it.

Due to the considerable amount of work involved, it won't arrive very soon, but it will be there sometime in the future :)

sdmorrey commented 1 month ago

Maybe I can pitch in soon. I've been planning to build something like a BitTorrent for AI for a while now.