Mozilla-Ocho / llamafile

Distribute and run LLMs with a single file.
https://llamafile.ai

instruct chat templates #451

Open CrispStrobe opened 1 month ago

CrispStrobe commented 1 month ago


Feature Description

It would be nice to directly support a number of chat templates (besides Alpaca instruct).

Motivation

This would facilitate, e.g., easy API usage and inclusion in pipelines that invoke OpenAI-compatible inference servers.

Possible Implementation

This could be done the way llama.cpp does it from #5538 onward; that code could probably be largely merged in and wrapped, as sketched below.
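
For illustration, a minimal sketch of what such a wrapper could look like on the llamafile server side, assuming the `llama_chat_apply_template()` API that #5538 added to llama.cpp. The helper name `format_chat_prompt`, the buffer sizing, and the chatml fallback are hypothetical details, not a proposed implementation:

```cpp
#include <string>
#include <vector>
#include "llama.h"

// Hypothetical helper: render an OpenAI-style message list into a prompt
// string using the model's built-in chat template (via #5538's API).
static std::string format_chat_prompt(const llama_model * model,
                                      const std::vector<llama_chat_message> & msgs) {
    std::vector<char> buf(4096);
    // tmpl == nullptr -> use the chat template stored in the GGUF metadata;
    // add_ass == true -> append the assistant prefix so generation can start.
    int32_t n = llama_chat_apply_template(model, /*tmpl=*/nullptr,
                                          msgs.data(), msgs.size(),
                                          /*add_ass=*/true,
                                          buf.data(), (int32_t) buf.size());
    if (n < 0) {
        // Template not recognized: fall back to a default such as chatml.
        n = llama_chat_apply_template(/*model=*/nullptr, "chatml",
                                      msgs.data(), msgs.size(), /*add_ass=*/true,
                                      buf.data(), (int32_t) buf.size());
    }
    if (n > (int32_t) buf.size()) {
        // Buffer was too small: the call returns the required size, so retry.
        buf.resize(n);
        n = llama_chat_apply_template(model, /*tmpl=*/nullptr,
                                      msgs.data(), msgs.size(), /*add_ass=*/true,
                                      buf.data(), (int32_t) buf.size());
    }
    return std::string(buf.data(), n > 0 ? (size_t) n : 0);
}

// Example: the messages array of an OpenAI-style /v1/chat/completions
// request would map to something like
//   std::vector<llama_chat_message> msgs = {
//       {"system", "You are a helpful assistant."},
//       {"user",   "Hello!"},
//   };
//   std::string prompt = format_chat_prompt(model, msgs);
```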