psugihara / FreeChat

llama.cpp based AI chat app for macOS
https://www.freechat.run
MIT License

Allow for gguf selection and multiple imports #45

Closed · shavit closed 8 months ago

shavit commented 8 months ago

This change allows importing only .gguf files; however, it does not dismiss the settings window. Users may want to manage their models rather than start a conversation right away.

Closes https://github.com/psugihara/FreeChat/issues/35
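
For reference, a minimal sketch of how that restriction might look with SwiftUI's `fileImporter`; the `ModelImportButton` view and `onImport` callback are illustrative stand-ins, not FreeChat's actual code:

```swift
import SwiftUI
import UniformTypeIdentifiers

struct ModelImportButton: View {
    // Hypothetical callback; FreeChat's real import plumbing may differ.
    var onImport: ([URL]) -> Void
    @State private var showingImporter = false

    var body: some View {
        Button("Add Model…") { showingImporter = true }
            .fileImporter(
                isPresented: $showingImporter,
                // Restrict the picker to .gguf files. UTType(filenameExtension:)
                // is failable, so fall back to .data if the type can't be built.
                allowedContentTypes: [UTType(filenameExtension: "gguf") ?? .data],
                allowsMultipleSelection: true
            ) { result in
                if case .success(let urls) = result {
                    onImport(urls)
                }
            }
    }
}
```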

vercel[bot] commented 8 months ago

The latest updates on your projects:

| Name | Status | Updated (UTC) |
| --- | --- | --- |
| free-chat | ✅ Ready | Jan 9, 2024 8:17pm |

psugihara commented 8 months ago

Hey, thanks for the contribution. This works well to limit imports to GGUFs and allow multiple selection, which is a nice improvement!

However, the intent of #35 was to support the case where I double-click a GGUF file.

In that case, I want FreeChat to open and load the GGUF as if I had set it in the menu.

(Screenshot 2024-01-10 at 11:01:49 AM)
psugihara commented 8 months ago

Editing the issue for clarity.

shavit commented 8 months ago

Got it. I'll need to look into the appDelegate and handling multiple windows, since in my branch dropping a file opens a new window. Multiple windows will also be necessary for private chat, or maybe for using multiple models.
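
For reference, a minimal sketch of the appDelegate route, assuming the .gguf type is declared in the app's Info.plist so Finder routes double-clicks to FreeChat. `ModelLoader`, its `load` method, and the app/scene names are hypothetical stand-ins, not FreeChat's actual code:

```swift
import AppKit
import SwiftUI

// Hypothetical stand-in for whatever FreeChat uses to load/switch models.
final class ModelLoader {
    static let shared = ModelLoader()
    func load(_ url: URL) {
        // ...restart the llama server with the new model, update settings, etc.
        print("Loading model at \(url.path)")
    }
}

// App delegate that receives double-clicked (or dropped) .gguf files and
// hands them to the shared loader, so the existing window can pick up the
// new model rather than spawning a separate window.
final class AppDelegate: NSObject, NSApplicationDelegate {
    func application(_ application: NSApplication, open urls: [URL]) {
        guard let model = urls.first(where: { $0.pathExtension.lowercased() == "gguf" }) else { return }
        ModelLoader.shared.load(model)
    }
}

@main
struct FreeChatApp: App {
    // Bridge the delegate into the SwiftUI app lifecycle.
    @NSApplicationDelegateAdaptor(AppDelegate.self) var appDelegate

    var body: some Scene {
        WindowGroup {
            Text("FreeChat")
        }
    }
}
```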

psugihara commented 8 months ago

Thanks for taking another look, excited about this feature!

Maybe I'm misunderstanding, but I don't think we need multiple windows. I only want one model and one instance of the llama server, etc., running at a time.

> private chat

If you're referring to #7, I imagine this working like any other conversation (in the same window) except that nothing is persisted to Core Data.
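
A minimal sketch of that idea, assuming FreeChat's Core Data stack exposes a parent `NSManagedObjectContext`; the helper name is hypothetical:

```swift
import CoreData

// Hypothetical helper: messages created in the returned context stay in
// memory only, as long as save() is never called on it.
func makePrivateChatContext(parent: NSManagedObjectContext) -> NSManagedObjectContext {
    let scratch = NSManagedObjectContext(concurrencyType: .mainQueueConcurrencyType)
    scratch.parent = parent
    // Discarding this context (or quitting the app) drops the conversation;
    // nothing reaches the persistent store unless save() is explicitly called.
    return scratch
}
```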

shavit commented 8 months ago

Yes, you're right.