genesis-ai-dev / codex-editor

Codex Scripture Editor and Translator's Copilot
https://codex-editor.gitbook.io/
MIT License

Decide on initial local LLM server option #17

Closed: ryderwishart closed this 3 months ago

ryderwishart commented 5 months ago

A smaller model (e.g., 2-3B parameters) could handle few-shot, single-token, or n-gram translation suggestions. This is the simplest use case and probably the one we want to support first.
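
For reference, a rough sketch of what that suggestion call could look like, assuming an OpenAI-compatible local completion endpoint. The URL, model name, and few-shot examples are placeholders, not a committed design:

```python
# Sketch only: few-shot, single-token suggestion against a small (2-3B) local model.
import requests

FEW_SHOT_EXAMPLES = [
    # (source phrase, next target-language word) -- illustrative pairs only
    ("In the beginning God created", "आदि"),
    ("the heavens and the earth", "आकाश"),
]

def build_prompt(source_phrase: str, partial_target: str) -> str:
    """Assemble a few-shot prompt asking for the next word of the draft translation."""
    lines = ["Suggest the next word of the target-language draft."]
    for src, nxt in FEW_SHOT_EXAMPLES:
        lines.append(f"Source: {src}\nNext word: {nxt}")
    lines.append(f"Source: {source_phrase}\nDraft so far: {partial_target}\nNext word:")
    return "\n\n".join(lines)

def suggest_next_token(source_phrase: str, partial_target: str) -> str:
    resp = requests.post(
        "http://localhost:8000/v1/completions",  # assumed OpenAI-compatible local server
        json={
            "model": "local-small-model",         # placeholder for a 2-3B parameter model
            "prompt": build_prompt(source_phrase, partial_target),
            "max_tokens": 3,                      # we only need a word or two back
            "temperature": 0.2,
        },
        timeout=10,
    )
    return resp.json()["choices"][0]["text"].strip()
```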

A larger model will be needed for chat, and we need some way to pick the right model based on the user's source language settings. This is obviously somewhat fragile, but it lets us begin supporting various gateway languages.
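
The settings-driven selection could be as simple as a lookup with a multilingual fallback; the table below is illustrative only and every model name is a placeholder:

```python
# Sketch of model selection by source-language setting; names are placeholders.
GATEWAY_LANGUAGE_MODELS = {
    "hi": "placeholder/hindi-capable-chat-model",
    "te": "placeholder/telugu-capable-chat-model",
    "kn": "placeholder/kannada-capable-chat-model",
    "ta": "placeholder/tamil-capable-chat-model",
}

DEFAULT_CHAT_MODEL = "placeholder/multilingual-chat-model"

def chat_model_for(language_code: str) -> str:
    """Pick a chat-capable model from the user's source language, with a multilingual fallback."""
    return GATEWAY_LANGUAGE_MODELS.get(language_code, DEFAULT_CHAT_MODEL)
```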

We can fire this up through txtai right now to keep things simple. We will also want PredictionGuard for an API-hosted version, so once we have more consistently online users we will likely need a proxy that can switch gracefully between the two.
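
A minimal sketch of that proxy, assuming txtai's LLM pipeline for the local path and stubbing out the PredictionGuard call since that integration isn't built yet; the model path is a placeholder:

```python
# Sketch: prefer the hosted API for online users, fall back to the local txtai pipeline.
from txtai.pipeline import LLM

local_llm = LLM("path/to/local-model")  # placeholder; any txtai-supported model works

def generate_remote(prompt: str) -> str:
    """Stub for the PredictionGuard-backed API path (not wired up yet)."""
    raise ConnectionError("remote generation not available")

def generate(prompt: str) -> str:
    """Try the remote API first, then degrade gracefully to the local model."""
    try:
        return generate_remote(prompt)
    except ConnectionError:
        return local_llm(prompt)
```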

ryderwishart commented 5 months ago

Hindi

Telugu

Kannada

Tamil

ryderwishart commented 3 months ago

For now we opted to simply use LM Studio. We can reopen this later if we also want to hide model management from the user, but at this point it would take a lot of work to rival LM Studio's flexibility.
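
For anyone wiring against it: LM Studio's local server exposes an OpenAI-compatible API (default base URL http://localhost:1234/v1), so the extension can talk to it with a standard OpenAI client. Sketch below; the model identifier is a placeholder for whatever the user has loaded:

```python
# Minimal call against LM Studio's OpenAI-compatible local server.
from openai import OpenAI

# The api_key is unused by LM Studio but required by the client library.
client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

response = client.chat.completions.create(
    model="local-model",  # placeholder; LM Studio routes to the currently loaded model
    messages=[{"role": "user", "content": "Translate 'peace' into Hindi."}],
    max_tokens=20,
)
print(response.choices[0].message.content)
```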