mlc-ai / mlc-llm

Universal LLM Deployment Engine with ML Compilation
https://llm.mlc.ai/
Apache License 2.0

[Question] How to show model download progress in the WebLLM JavaScript SDK? #3014

Closed · DenisSergeevitch closed this 1 week ago

DenisSergeevitch commented 2 weeks ago

❓ General Questions

I'm using this example: https://jsfiddle.net/neetnestor/4nmgvsa2/

I can't find any information on how to show the client: 1) how much data will be downloaded (in MB), and 2) the progress of the model download, via a progress bar.

Is this functionality simply not available in the WebLLM JavaScript SDK?

Neet-Nestor commented 1 week ago

Hi Denis!

The total download size and the current progress of the model download can be retrieved through the initProgressCallback function you pass when you load/reload the engine. In the jsfiddle example, check the following code snippet:

function updateEngineInitProgressCallback(report) {
  // report.progress is a fraction from 0 to 1; report.text is a human-readable status string
  console.log("initialize", report.progress);
  document.getElementById("download-status").textContent = report.text;
}

You can build a loading progress bar from the report object. Here is an example: https://github.com/mlc-ai/web-llm-assistant/blob/main/src/popup.ts#L33
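For instance, here is a minimal sketch of wiring the callback into an HTML progress element when creating the engine; the model id and the element ids are illustrative placeholders on my side, not part of the fiddle:

import { CreateMLCEngine } from "@mlc-ai/web-llm";

// Assumes the page contains (illustrative ids):
//   <progress id="download-progress" max="1" value="0"></progress>
//   <span id="download-status"></span>
// Run inside an ES module or another async context.
const engine = await CreateMLCEngine(
  "Llama-3.1-8B-Instruct-q4f32_1-MLC", // any model_id from the prebuilt list
  {
    initProgressCallback: (report) => {
      // Drive the progress bar directly from the 0..1 progress fraction
      document.getElementById("download-progress").value = report.progress;
      document.getElementById("download-status").textContent = report.text;
    },
  }
);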

If you want the total download size for each model before even loading the engine, you can check the corresponding Hugging Face repo for the total repo size. The model repos are listed in webllm/config.ts: https://github.com/mlc-ai/web-llm/blob/main/src/config.ts#L312. If you need this information, please let us know and we can add it directly to the SDK so you can read it from there.
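In the meantime, here is a rough sketch of estimating the size client-side by summing the file sizes returned by Hugging Face's repo listing API; the prebuiltAppConfig lookup and the /api/models/<repo>/tree/main endpoint are assumptions on my side, not a documented part of the WebLLM SDK:

import { prebuiltAppConfig } from "@mlc-ai/web-llm";

// Rough sketch: estimate a model's download size by summing the file sizes
// reported by the Hugging Face repo listing API. Assumes the record's `model`
// field points at the model's Hugging Face repo URL.
async function estimateDownloadSizeMB(modelId) {
  const record = prebuiltAppConfig.model_list.find((m) => m.model_id === modelId);
  if (!record) throw new Error(`Unknown model_id: ${modelId}`);

  // "https://huggingface.co/mlc-ai/Some-Model-MLC" -> "mlc-ai/Some-Model-MLC"
  // (also strips a trailing "/resolve/main" if the URL includes one)
  const repo = new URL(record.model).pathname.split("/resolve/")[0].replace(/^\/+|\/+$/g, "");

  const resp = await fetch(`https://huggingface.co/api/models/${repo}/tree/main`);
  const files = await resp.json();
  // For LFS-tracked weight shards, prefer the LFS object size when present
  const totalBytes = files.reduce((sum, f) => sum + (f.lfs?.size ?? f.size ?? 0), 0);
  return totalBytes / (1024 * 1024);
}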

Please let me know if this resolves your need, or if you need additional functionality from the SDK.