> [!NOTE]
> This project is still in an alpha stage of development. Significant changes are very likely and backwards compatibility is disregarded.
The Alpaca Core Local SDK, or AC Local for short, is a multi-platform SDK for local AI Inference.
"Local" here means running on the device which executes the code. This could be a server, a desktop, or a mobile device.
It provides a unified API for doing inference with multiple models. The API itself can be split into two layers.
Read the full introduction here.
The list of supported models will be updated as new ones are added.
A quick example of the C++ API:

```cpp
// create a model factory and register the llama.cpp inference backend
ac::local::ModelFactory factory;
ac::local::addLlamaInference(factory);

// load a model by describing its inference type and assets
auto model = factory.createModel(
    {
        .inferenceType = "llama.cpp",
        .assets = {
            {.path = "/path/to/model.gguf"}
        }
    }, {}, {}
);

// create an inference instance and run an op with it
auto instance = model->createInstance("general", {});

auto result = instance->runOp("run",
    {{"prompt", "If you could travel faster than light,"}}, {});

std::cout << result << "\n";
```
- `NSDictionary` to `ac::Dict` and back in Objective-C++

There are multiple examples in this repo. Look for the `example` directories throughout the tree. Besides them, there are also several larger demos in separate repositories.
The repo has submodules. Don't forget to fetch them.
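The usual git commands apply; for example (the repository URL below is a placeholder):

```bash
# clone together with all submodules
git clone --recurse-submodules <repo-url>

# or fetch them in an already-cloned working copy
git submodule update --init --recursive
```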
Use CMake. The project can be built standalone (as a root project) or included as a subdirectory of another CMake project. Some useful presets are provided in the repo.
Detailed build instructions can be found in the documentation.
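As a rough sketch, a standalone configure-and-build could look like the following. This uses plain CMake commands rather than the provided presets, whose names are not listed here; see the documentation for the authoritative steps.

```bash
# configure an out-of-source build directory and build it
cmake -S . -B build -DCMAKE_BUILD_TYPE=Release
cmake --build build
```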
This software is distributed under the MIT Software License. See the accompanying file LICENSE or a copy here.
Copyright © 2024 Alpaca Core, Inc
A list of the third party libraries used can be found here. Please consider supporting them.