p-e-w / arrows

A frontend for creative writing with LLMs
GNU Affero General Public License v3.0

Added support for Chat Completion Model #5

Open onlylonly opened 2 months ago

onlylonly commented 2 months ago

I've added support for 'chat' models, and the ability to switch between chat and completion model types.

Added an example in config.ts.

config.ts:

main.ts:
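The actual diff isn't shown in the thread, but the switch described above would look roughly like this sketch; the `apiType` field name and endpoint paths are assumptions, not the PR's actual code:

```typescript
// Hypothetical sketch of a chat/completion toggle; the real field names
// and request shapes in this PR's config.ts and main.ts may differ.
type ApiType = "completion" | "chat";

interface Config {
  apiType: ApiType; // assumed name for the new switch
  apiUrl: string;
  model: string;
}

// Route the prompt to the endpoint matching the configured model type.
function buildRequest(config: Config, prompt: string): { url: string; body: object } {
  if (config.apiType === "chat") {
    return {
      url: `${config.apiUrl}/v1/chat/completions`,
      body: { model: config.model, messages: [{ role: "user", content: prompt }] },
    };
  }
  return {
    url: `${config.apiUrl}/v1/completions`,
    body: { model: config.model, prompt },
  };
}
```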

onlylonly commented 2 months ago

currently PARAMS from config.ts seems to be unused. Should we remove it?

p-e-w commented 2 months ago

Thanks for the effort, but this is not the right way to broaden loader support. The right way is to add support for the text completion endpoint to those loaders (which I believe is currently happening in Ollama). Chat completion is a semantic mismatch for text completion, and using it to do the latter is a hack that I don't want in the code.

The fact that OpenAI restricts GPT-4 to the chat completion endpoint is unfortunate (and clearly intended to further limit what users can do with their models), but not a sufficient reason for doing things the wrong way.

As for local models, they all support text completion ("chat completion" is just text completion with a specific template), so no changes are required to use e.g. Llama 3 Instruct. The only problem is that some loaders, notably Ollama and Kobold, don't expose that endpoint, but that is their bug to fix.
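To illustrate the point that chat completion is just templated text completion, here is a sketch of how a message list collapses into a plain prompt string using the Llama 3 Instruct template (the exact token spelling should be checked against the model card):

```typescript
// "Chat completion" is text completion with a template applied: the
// messages are flattened into one prompt string. Sketched here with the
// Llama 3 Instruct template as an example.
interface Message {
  role: "system" | "user" | "assistant";
  content: string;
}

function applyLlama3Template(messages: Message[]): string {
  let prompt = "<|begin_of_text|>";
  for (const m of messages) {
    prompt += `<|start_header_id|>${m.role}<|end_header_id|>\n\n${m.content}<|eot_id|>`;
  }
  // Leave the prompt open so the model generates the assistant's reply.
  prompt += "<|start_header_id|>assistant<|end_header_id|>\n\n";
  return prompt;
}
```

The resulting string can be sent to a plain text completion endpoint, which is why no chat endpoint is needed to use instruct-tuned local models.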

> currently PARAMS from config.ts seems to be unused. Should we remove it?

It's not unused; it's included in the `params` variable, though somehow you seem to have removed that in this PR.
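The merge being described would look something like this; the contents of `PARAMS` here are invented for illustration, not the actual values in config.ts:

```typescript
// Hypothetical illustration of PARAMS from config.ts being folded into
// the params variable; the real values and merge site may differ.
const PARAMS = { temperature: 0.8, max_tokens: 256 }; // assumed contents

// PARAMS supplies defaults; per-request values override them.
function makeParams(overrides: Record<string, unknown>): Record<string, unknown> {
  return { ...PARAMS, ...overrides };
}
```

Dropping `PARAMS` from this merge would silently discard the user's configured defaults, which is why removing it is not a no-op.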