Alpaca Core Local SDK


[!NOTE] This project is still in an alpha stage of development. Significant changes are very likely and backwards compatibility is disregarded.

The Alpaca Core Local SDK, or AC Local for short, is a multi-platform SDK for local AI inference.

"Local" here means running on the device which executes the code. This could be a server, a desktop, or a mobile device.

It provides a unified API for doing inference with multiple models. The API itself can be split into two layers.

Read the full introduction here.

Supported models

This list will be updated as new models are added.

Minimal Example

#include <iostream> // for std::cout (the ac-local headers for the factory and the llama.cpp backend must also be included)

// Create a model factory and register the llama.cpp inference backend with it
ac::local::ModelFactory factory;
ac::local::addLlamaInference(factory);

// Load a model: the llama.cpp inference type with a local GGUF file as its asset
auto model = factory.createModel(
    {
        .inferenceType = "llama.cpp",
        .assets = {
            {.path = "/path/to/model.gguf"}
        }
    }, {}, {}
);

// Create an inference instance of the model
auto instance = model->createInstance("general", {});

// Run an op on the instance: complete the given prompt
auto result = instance->runOp("run",
    {{"prompt", "If you could travel faster than light,"}}, {});

// Print the result
std::cout << result << "\n";

Bindings, Wrappers, and Integrations

Demos

There are multiple examples in this repo. Look for the example directories throughout the tree. Besides them, there are also several larger demos in separate repositories.

Build

The repo has submodules. Don't forget to fetch them.

Use CMake. The project works both as a root project and as a subdirectory of a parent project. Some useful presets are provided in the repo.
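For illustration, a from-scratch build which fetches the submodules and then configures and builds with plain CMake (without any of the provided presets) could look like the sketch below; the GitHub URL is assumed from the repository name.

# clone the repo together with its submodules
git clone --recurse-submodules https://github.com/alpaca-core/ac-local.git
cd ac-local

# or, in an existing checkout, fetch the submodules explicitly
git submodule update --init --recursive

# configure and build with CMake
cmake -B build -DCMAKE_BUILD_TYPE=Release
cmake --build build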

Detailed build instructions can be found in the documentation.

License

This software is distributed under the MIT Software License. See accompanying file LICENSE or copy here.

Copyright © 2024 Alpaca Core, Inc

Third Party Libraries

A list of the third party libraries used can be found here. Please consider supporting them.