undreamai / LlamaLib

MIT License

llama.cpp library for UndreamAI

LlamaLib implements an API for the llama.cpp server. The focus of this project is to:

Each release contains:

The following architectures are provided:

In addition, the windows-archchecker and linux-archchecker libraries are used to determine the presence and type of AVX instructions on Windows and Linux.
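
As a point of reference, the same information can be checked by hand on Linux by inspecting the CPU flags; this is only a manual equivalent of what the archchecker libraries report, not their actual interface:

```sh
# Print which AVX variants the CPU advertises on Linux
# (manual equivalent of the check the linux-archchecker library performs)
grep -m1 -o -w -E 'avx512f|avx2|avx' /proc/cpuinfo
```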

The server command-line options can be listed by running ./undreamai_server -h on Linux/macOS or .\undreamai_server.exe -h on Windows for the architecture of interest.
More information on the different options can be found in the llama.cpp server README.
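
As an illustration, a typical launch might look like the sketch below. It assumes the binary accepts the standard llama.cpp server options (-m, --host, --port, -ngl, -c); the model path, host and port are placeholders:

```sh
# Show all available options for the downloaded build
./undreamai_server -h

# Launch the server with a local GGUF model (placeholder path),
# listening on port 8080, offloading layers to the GPU on CUDA builds
# and using a 4096-token context
./undreamai_server -m models/my-model.gguf --host 0.0.0.0 --port 8080 -ngl 99 -c 4096
```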

The server binaries can be used to deploy remote servers for LLMUnity.
The exact command to run can be printed from within Unity by running the scene.
More information can be found in the "Use a remote server" section of the LLMUnity README.
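
For example, a remote deployment could look like the following sketch; the options mirror the llama.cpp server, the model path, host and port are placeholders, and the /health endpoint is assumed to be exposed as in the upstream llama.cpp server:

```sh
# On the server machine: make the server reachable from other machines
./undreamai_server -m models/my-model.gguf --host 0.0.0.0 --port 8080

# On the machine running Unity: verify the server is reachable
# (replace <server-ip> with the address of the server machine)
curl http://<server-ip>:8080/health
```

The host and port used here must match the remote server settings configured on the LLMUnity side.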