SciSharp / LLamaSharp

A C#/.NET library to run LLM (🦙LLaMA/LLaVA) on your local device efficiently.
https://scisharp.github.io/LLamaSharp
MIT License

October binary update #940

Closed · martindevans closed this 1 month ago

martindevans commented 1 month ago

Updated binaries to llama.cpp c35e586ea57221844442c65a1172498c54971cb0, built with https://github.com/SciSharp/LLamaSharp/actions/runs/10984319249

This is an unusually large change due to a redesign of the llama.cpp sampling API. All of the new sampling system has been exposed in a safe way, except for custom samplers, which are not yet implemented and will need to be tackled in a future update.

Assorted notable changes:

Testing:

m0nsky commented 1 month ago

Test application is running fine on:

I noticed seed was removed from the ModelParams, how can we set the seed after this PR?

Edit: I see there's a SetSeed() in LLamaContext. I tried using context.SetSeed(1337); but it throws System.AggregateException: One or more errors occurred. (Unable to find an entry point named 'llama_set_rng_seed' in DLL 'llama'.)

martindevans commented 1 month ago

I noticed seed was removed from the ModelParams, how can we set the seed after this PR?

The seed is now set per-sampler, see my other comment for an example.

I'm glad you mentioned this, though, because I didn't pass that through to the higher level! I'll add a Seed property to DefaultSamplingPipeline, which is the more direct replacement for the old ModelParams seed.

there's a SetSeed() in LLamaContext

Good catch, that method should have been removed too. It calls llama_set_rng_seed, which no longer exists in llama.cpp, hence the missing-entry-point exception.

m0nsky commented 1 month ago

Can confirm the Seed in the DefaultSamplingPipeline now works as expected!
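For reference, a minimal sketch of seeding under the new per-sampler design. The names DefaultSamplingPipeline and Seed come from this thread; wiring the pipeline through InferenceParams.SamplingPipeline is an assumption about the surrounding LLamaSharp API, not something confirmed above.

```csharp
// Sketch only: assumes the Seed property added in this PR and the usual
// LLamaSharp inference-params wiring (not verified against a specific release).
var inferenceParams = new InferenceParams
{
    // The seed now lives on the sampling pipeline, not on ModelParams.
    SamplingPipeline = new DefaultSamplingPipeline
    {
        Seed = 1337
    }
};
```

With the same seed and prompt, two runs through the pipeline should produce identical sampling choices, which is the behavior m0nsky confirms above.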

SignalRT commented 1 month ago

Test is running fine on: