fwcd / d2

Command-based virtual assistant for Discord and other platforms
GNU General Public License v3.0

Investigate integrating a local language model #165

Open fwcd opened 1 week ago

fwcd commented 1 week ago

Highly quantized language models that can run locally are becoming increasingly popular, with even Chrome shipping a Gemini Nano model in its latest Canary builds. Models like Phi-3-mini already achieve impressive performance for their comparatively small size and support cross-platform inference via candle, a Rust machine learning library.

It would be cool if we could bundle such a model with D2, e.g. as a command and/or as a Conversator.
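A rough sketch of how this could look on the D2 side, assuming a simplified Conversator-style interface and a hypothetical `LocalLanguageModel` wrapper around whichever inference backend we end up picking (candle, llama.cpp, ...). Names and signatures below are illustrative only, not D2's actual API:

```swift
import Foundation

/// Hypothetical abstraction over a locally running language model
/// (e.g. a quantized Phi-3-mini loaded via candle or llama.cpp).
protocol LocalLanguageModel {
    /// Generates a completion for the given prompt, up to `maxTokens` tokens.
    func complete(prompt: String, maxTokens: Int) async throws -> String
}

/// Sketch of a Conversator-style component that answers messages
/// using the local model instead of a remote API.
struct LocalLLMConversator {
    let model: LocalLanguageModel

    func answer(input: String, username: String) async throws -> String {
        // Keep the prompt small; highly quantized models have limited context.
        let prompt = """
        You are D2, a helpful Discord assistant.
        \(username): \(input)
        D2:
        """
        return try await model.complete(prompt: prompt, maxTokens: 256)
    }
}
```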

fwcd commented 1 week ago

llama.cpp and llama.swift would be worth investigating, even though the latter might primarily target Apple platforms.
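A low-effort way to prototype this before committing to proper bindings might be to shell out to a llama.cpp build from Swift. The sketch below assumes a `llama-cli` binary and a GGUF model on disk; the binary name, install path, and flags vary between llama.cpp versions and would need to be verified:

```swift
import Foundation

/// Minimal prototype that runs a prompt through a local llama.cpp build
/// by spawning its CLI. Paths and flags are assumptions to double-check
/// against the installed llama.cpp version.
func runLlamaCpp(prompt: String, modelPath: String) throws -> String {
    let process = Process()
    process.executableURL = URL(fileURLWithPath: "/usr/local/bin/llama-cli") // assumed install location
    process.arguments = [
        "-m", modelPath, // path to a quantized GGUF model, e.g. Phi-3-mini
        "-p", prompt,    // prompt text
        "-n", "256",     // max number of tokens to generate
    ]

    let stdout = Pipe()
    process.standardOutput = stdout
    try process.run()
    process.waitUntilExit()

    let data = stdout.fileHandleForReading.readDataToEndOfFile()
    return String(data: data, encoding: .utf8) ?? ""
}
```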