ml-explore / mlx-swift-examples

Examples using MLX Swift
MIT License

Crash when start #101

Closed xlisp closed 3 months ago

xlisp commented 3 months ago

My project code: https://github.com/chanshunli/autogui_llm (following https://github.com/ml-explore/mlx-swift-examples/tree/main/Applications/LLMEval). I can't import LLM, only MLXLLM, and it gives an error:

[screenshot of the build error]

xlisp commented 3 months ago

When I run it on a real iPhone 12, it crashes with this error:

[screenshot of the crash log, 2024-08-10]

My iPhone 12: [screenshot of device info]

xlisp commented 3 months ago

Running ModelConfiguration.phi4bit succeeds, but other apps like 'Local Chat' can run gemma2b. Why?

davidkoski commented 3 months ago

The first issue is about running on the simulator -- it isn't supported because the simulator doesn't emulate enough of the GPU.
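One way to make this explicit in app code is to branch on Swift's standard `targetEnvironment(simulator)` compile-time condition. This is a sketch (the helper name is mine, not part of the MLX API):

```swift
// Sketch: detect the iOS simulator at compile time so GPU-backed model
// evaluation can be skipped or stubbed out there. The MLX GPU path is
// only expected to work on real devices (and Apple silicon Macs).
func isSimulator() -> Bool {
    #if targetEnvironment(simulator)
    return true   // simulator: MLX GPU evaluation is not supported
    #else
    return false  // physical device or Mac: GPU evaluation available
    #endif
}
```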

davidkoski commented 3 months ago

As to the second, it is all about memory -- these models are large, and the error you see indicates you exceeded the amount of memory iOS lets your app use.

I have an iPhone 12 Pro Max and it can run these models, but a plain iPhone 12 may have less memory.

One other thing to check is this entitlement (from here):

Finally, if the device your code runs on has more RAM than the jetsam limit would normally allow, you can use the Increased Memory Limit entitlement.
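For reference, the entitlement goes in the app's `.entitlements` file (and must also be enabled under Signing & Capabilities in Xcode). A minimal sketch using Apple's standard key name:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- Raises the jetsam memory limit on supported devices -->
    <key>com.apple.developer.kernel.increased-memory-limit</key>
    <true/>
</dict>
</plist>
```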

If all of that is in place but the larger model doesn't work, then your device just doesn't have enough RAM to evaluate it.
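As a rough pre-flight check before attempting to load a model, you can compare the device's physical RAM against an estimated footprint. This is a sketch: the 0.5 usable fraction and 1.5 overhead factor are illustrative assumptions, not documented iOS limits, and the byte count below is only a ballpark.

```swift
import Foundation

// Rough heuristic: will a model of `modelBytes` plausibly fit in the
// memory iOS lets this app use? iOS terminates (jetsams) apps well below
// total RAM, so assume only about half of physical memory is usable
// without the Increased Memory Limit entitlement (assumption, not a spec).
func canLikelyFit(modelBytes: UInt64, overheadFactor: Double = 1.5) -> Bool {
    let physical = ProcessInfo.processInfo.physicalMemory
    let usable = Double(physical) * 0.5
    // Weights plus KV cache and runtime overhead, estimated crudely.
    let needed = Double(modelBytes) * overheadFactor
    return needed <= usable
}

// Example: a 4-bit ~2B-parameter model is on the order of 1.5 GB of weights.
let modelBytes: UInt64 = 1_500_000_000
print(canLikelyFit(modelBytes: modelBytes))
```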

xlisp commented 3 months ago

Thanks @davidkoski 🌹