**Closed** — bitterspeed closed this issue 2 months ago
I have shared my investigation of this problem on the Electron issue tracking this matter: https://github.com/electron/electron/issues/41513#issuecomment-2094327194
:tada: This issue has been resolved in version 3.0.0-beta.18 :tada:
The release is available on:
- v3.0.0-beta.18
Your semantic-release bot :package::rocket:
### Issue description

Electron crashes when `loadModel` finishes loading (beta)
### Expected Behavior
After loading a model using this code and trying to create a context, I'd expect there not to be a crash in `llama-addon.node`. I've tried with and without Metal enabled. If a crash happens, I'd expect there to be an error log from node-llama-cpp, but no error logs show up from node-llama-cpp either.
### Actual Behavior
When creating a context with Llama 3 in Electron v28, Electron crashes at runtime with an `EXC_CRASH` error (and no console logs).

### Steps to reproduce
Use this code in `src/main.ts` to load a model. (The code works fine in terminal Node.js, which may suggest this is an issue on Electron's side, but I don't know enough to be sure; please let me know if you have ideas on where to start debugging.)
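The original repro code was not captured in this thread. As a rough sketch only, a `src/main.ts` that loads a model and creates a context with the node-llama-cpp 3.0.0-beta API (`getLlama`, `loadModel`, `createContext`) might look like the following; the model path is a hypothetical placeholder, not taken from the report:

```typescript
// Minimal Electron main-process sketch (assumes the node-llama-cpp 3.0.0-beta API).
// The model path below is a placeholder, not from the original report.
import {app} from "electron";
import {getLlama} from "node-llama-cpp";

app.whenReady().then(async () => {
    // Loads the native llama-addon.node binding
    const llama = await getLlama();

    const model = await llama.loadModel({
        modelPath: "/path/to/model.gguf" // hypothetical path
    });

    // The reported EXC_CRASH happens around context creation in Electron v28
    const context = await model.createContext();
    console.log("context created");
});
```

Running the equivalent code directly under Node.js (without Electron) reportedly does not crash, which is what points the investigation toward Electron's handling of the native addon.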
### My Environment

- `node-llama-cpp` version

### Additional Context

No response
### Relevant Features Used
### Are you willing to resolve this issue by submitting a Pull Request?
Yes, I have the time, but I don't know how to start; I would need guidance.