janhq / cortex.llamacpp

cortex.llamacpp is a high-efficiency C++ inference engine for edge computing. It is a dynamic library that can be loaded by any server at runtime.
GNU Affero General Public License v3.0

Update llama.cpp submodule to latest release b3943 #257

Closed: jan-service-account closed this 3 weeks ago

jan-service-account commented 4 weeks ago

This PR updates the llama.cpp submodule to the latest release: b3943.
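A submodule bump like this one is recorded as a new pinned commit in the superproject. The sketch below is a hedged illustration of that workflow, not this repo's actual release script: to keep it runnable anywhere, it fabricates a tiny stand-in "upstream" repo with a tag (named `b3943` after this PR) and a superproject tracking it at `vendor`; in the real repo you would skip the setup and run only the fetch/checkout/commit steps inside the existing `llama.cpp` submodule directory.

```shell
set -e
work=$(mktemp -d)
cd "$work"

# --- setup: fake upstream with a tagged release (stand-in for llama.cpp) ---
git init -q upstream
(cd upstream \
  && git -c user.email=a@b -c user.name=ci commit -q --allow-empty -m "init" \
  && git -c user.email=a@b -c user.name=ci commit -q --allow-empty -m "release" \
  && git tag b3943 \
  && git reset -q --hard HEAD~1)   # branch points at the older commit; the tag keeps the release

# --- setup: superproject that pins the upstream as a submodule at vendor/ ---
git init -q super
cd super
git -c user.email=a@b -c user.name=ci commit -q --allow-empty -m "init"
# newer git disallows file:// submodule URLs by default; allow it for this local demo
git -c protocol.file.allow=always submodule add -q "$work/upstream" vendor
git -c user.email=a@b -c user.name=ci commit -q -m "add submodule"

# --- the actual bump, as in this PR: fetch tags, check out the release tag,
#     then commit the new submodule pointer in the superproject ---
git -C vendor fetch -q --tags origin
git -C vendor checkout -q b3943
git add vendor
git -c user.email=a@b -c user.name=ci commit -q -m "Update submodule to b3943"

git submodule status vendor   # shows the pinned commit described by tag b3943
```

The superproject stores only the submodule's commit hash, so downstream consumers get the exact `b3943` tree when they run `git submodule update --init`.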