mlc-ai / web-llm-chat

Chat with AI large language models running natively in your browser. Enjoy private, server-free, seamless AI conversations.
https://chat.webllm.ai/
Apache License 2.0

[Feature Request]: support static builds #46

Closed: waldenn closed this issue 4 months ago

waldenn commented 4 months ago

Problem Description

I would like a static build of the web app (if possible!).

Adding "output: export" in "next.config.mjs", complains about the header, and after removing some of those, the build still gives an error. Not sure how to fix this.

Solution Description

It would be nice if this static build mode were properly supported; I think this app should also be able to run fully locally. Thanks for a great project!

Neet-Nestor commented 4 months ago

It's already supported. Just use `yarn export` instead of `yarn build`.

It may complain about the headers, but I don't think that affects the build.
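For reference, a sketch of the suggested workflow. The `export` script name comes from the reply above; the `out/` output directory is an assumption based on Next.js's static-export default:

```sh
# Install dependencies, then produce the static build.
# The static site lands in Next.js's default export directory (typically out/)
# and can be served from any static file host.
yarn install
yarn export
```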