Closed arnaudbreton closed 1 month ago
I think the closest thing to this that will be available is this repo: https://github.com/huggingface/chat-macOS (cc @cyrilzakka )
Otherwise you can always set up a chat-ui deployment at home that runs the local models you want and access it remotely from a mobile device. But I assume you're more interested in on-device inference?
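For reference, a home chat-ui deployment backed by local models can be pointed at an Ollama server through the `MODELS` variable in `.env.local`. This is a rough sketch assuming Ollama is serving on its default port; the model name, `ollamaName`, and parameters are placeholders you'd adapt to whatever you have pulled locally:

```env
MODELS=`[
  {
    "name": "Local Mistral (Ollama)",
    "parameters": {
      "temperature": 0.7,
      "max_new_tokens": 1024
    },
    "endpoints": [
      {
        "type": "ollama",
        "url": "http://127.0.0.1:11434",
        "ollamaName": "mistral"
      }
    ]
  }
]`
```

With something like this in place, the chat-ui instance runs inference through the local Ollama server, and exposing it on your home network (or via a VPN) lets a phone reach it from the browser.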
Hi @nsarrazin , thanks for following up!
Indeed, here I'm looking to run on-device inference, especially on iOS. That's why the closest example I'm aware of is the PocketPal app.
On macOS, I'm using [Open WebUI](https://openwebui.com/) and Ollama, which so far is working great.
The iOS app HuggingChat is not open source, so I'm going to close this for now; I'm not sure if there are plans to change that at the moment.
This repo is for the webapp chat-ui :smile:
Describe your feature request
I was looking for an open-source alternative to PocketPal (https://apps.apple.com/us/app/pocketpal-ai/id6502579498), which lets you chat with local models on iOS and Android, and I was wondering if HuggingChat could be that alternative. The idea is to have an end-to-end open-source solution, providing end-to-end privacy.
I hope I didn't miss anything in the app that already supports this.
Thanks
Screenshots (if relevant)
Implementation idea
I'm happy to help, given support from the community and the Hugging Face team. I have experience in web development, but not with running LLMs on mobile.