alpaca-core / ilib-llama.cpp

Alpaca Core wrapper and plugin for llama.cpp
MIT License

llama: add multiple prompt processing #7

Open iboB opened 3 months ago

iboB commented 3 months ago

Intentionally skipped while implementing alpaca-core/ac-local#3

This is done by setting llama_context_params::n_seq_max to a value greater than one.
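A rough sketch of what this could look like against the llama.cpp C API (this is not code from this plugin; the model path, token counts, and exact wiring are placeholders, and API names may differ across llama.cpp versions):

```cpp
#include "llama.h"

int main() {
    llama_model_params mparams = llama_model_default_params();
    // "model.gguf" is a placeholder path
    llama_model * model = llama_load_model_from_file("model.gguf", mparams);

    llama_context_params cparams = llama_context_default_params();
    cparams.n_seq_max = 2; // allow two independent sequences in this context
    llama_context * ctx = llama_new_context_with_model(model, cparams);

    // one batch can carry tokens from two prompts, tagged by sequence id
    llama_batch batch = llama_batch_init(/*n_tokens_alloc=*/512, /*embd=*/0, /*n_seq_max=*/2);
    // for each token i of either prompt (loop elided):
    //   batch.token[i]     = <token id>;
    //   batch.pos[i]       = <position within its own prompt>;
    //   batch.n_seq_id[i]  = 1;
    //   batch.seq_id[i][0] = 0 or 1, depending on which prompt it belongs to;
    //   batch.logits[i]    = <1 for the last token of each prompt, else 0>;
    // then set batch.n_tokens accordingly
    llama_decode(ctx, batch); // processes both prompts in a single call

    llama_batch_free(batch);
    llama_free(ctx);
    llama_free_model(model);
    return 0;
}
```

With this setup the KV cache is shared by both sequences, so downstream decoding has to keep their sequence ids distinct when sampling continuations.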

Additionally, we should devise a good way to expose this through both the module API and the SDK API.