mrtrizer / UnityLlamaCpp

Llama.cpp in Unity, straightforward and clean

Does it work on GPU #3

Closed whynames closed 6 months ago

whynames commented 6 months ago

Hey, does it work on GPU?

whynames commented 6 months ago

Is it Windows only? I'm on Mac, and I get this:

EntryPointNotFoundException: llama_backend_init assembly: type: member:(null)
  Abuksigun.LlamaCpp.LlamaModel+<>c__DisplayClass14_0.b0 () (at Assets/UnityLlamaCpp/LlamaModel.cs:34)
  System.Threading.Tasks.Task`1[TResult].InnerInvoke () (at <04566754dabf4aad92818a224d3e6586>:0)
  System.Threading.Tasks.Task.Execute () (at <04566754dabf4aad92818a224d3e6586>:0)
--- End of stack trace from previous location where exception was thrown ---
  Abuksigun.LlamaCpp.LlamaModel.LoadModel (System.String modelPath, System.IProgress`1[T] progress, System.UInt32 contextSize, System.Int32 gpuLayers) (at Assets/UnityLlamaCpp/LlamaModel.cs:32)
  Abuksigun.LlamaCpp.LlamaExample.RunAsync () (at Assets/UnityLlamaCpp/LlamaExample.cs:30)
  System.Runtime.CompilerServices.AsyncMethodBuilderCore+<>c.b7_0 (System.Object state) (at <04566754dabf4aad92818a224d3e6586>:0)
  UnityEngine.UnitySynchronizationContext+WorkRequest.Invoke () (at /Users/bokken/build/output/unity/unity/Runtime/Export/Scripting/UnitySynchronizationContext.cs:155)
  UnityEngine.UnitySynchronizationContext.Exec () (at /Users/bokken/build/output/unity/unity/Runtime/Export/Scripting/UnitySynchronizationContext.cs:83)
  UnityEngine.UnitySynchronizationContext.ExecuteTasks () (at /Users/bokken/build/output/unity/unity/Runtime/Export/Scripting/UnitySynchronizationContext.cs:109)

mrtrizer commented 6 months ago

You would need to build it for Mac using CMake. I didn't have time to build the library for macOS, sorry.

You need to use this exact release: https://github.com/ggerganov/llama.cpp/releases/tag/b1518, because in newer versions the context structure has changed and the C# bindings won't work.
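Roughly, the macOS build would look something like the sketch below. This is an untested outline: the CMake option names and the output location are assumptions for that revision, and the final plugin path depends on how UnityLlamaCpp is set up in your project, so check the tag's CMakeLists.txt before relying on it.

```sh
# Untested sketch: build llama.cpp tag b1518 as a shared library on macOS.
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
git checkout b1518

# BUILD_SHARED_LIBS produces libllama.dylib; LLAMA_METAL enables Metal GPU
# offload (option name assumed for this revision).
cmake -B build -DBUILD_SHARED_LIBS=ON -DLLAMA_METAL=ON
cmake --build build --config Release

# Then copy the resulting libllama.dylib from build/ into the folder where
# the Unity project expects the native plugin (exact path depends on setup).
```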

For Windows, see this issue: https://github.com/mrtrizer/UnityLlamaCpp/issues/2

I also highly recommend using https://github.com/SciSharp/LLamaSharp, since I don't have time to support this package.