nomic-ai / gpt4all

GPT4All: Run Local LLMs on Any Device. Open-source and available for commercial use.
https://nomic.ai/gpt4all
MIT License
69.8k stars 7.64k forks

"CPU Does Not Meet Requirements" on Snapdragon(R) X 12-core X1E801000 #3026

Open anothywhite opened 6 days ago

anothywhite commented 6 days ago

Bug Report

When launching GPT4All, I get the pop-up: "Encountered an error starting up: 'Incompatible hardware detected.' Unfortunately, your CPU does not meet the minimal requirements to run this program. In particular, it does not support AVX intrinsics, which this program requires to successfully run a modern large language model. The only solution at this time is to upgrade your hardware to a more modern CPU."

I'm running this on a high-end Surface Laptop 7 (Copilot+ PC), so it's definitely not a "more modern CPU" issue. Apologies if this has been reported before, since I imagine it's a common error (we have five of these computers and all show the same error), but I couldn't find an existing issue for it.

Steps to Reproduce

The error appears on launch of GPT4All and I cannot get past it into the application (this was the case for v3.3.0 and still is for v3.3.1).

Your Environment

brankoradovanovic-mcom commented 6 days ago

Indeed, it appears that these Snapdragon-based Surface laptops can't execute AVX instructions.

Non-AVX builds of GPT4All are not planned: #1540. The rationale was that non-AVX x64 CPUs are fairly old now and too slow to usefully run GPT4All anyway. ARM-based CPUs are, of course, an exception, since they are brand new, but it seems likely that non-AVX x64 builds, if they existed, would still be slow on these machines because the x64 code would run under emulation.
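To illustrate why the startup check fails here: AVX is an x86-64 instruction-set extension, so an ARM CPU can never report it, and a sensible hardware check has to branch on architecture first. Below is a minimal, hypothetical sketch of such a check (not GPT4All's actual code); the function name is made up, and the `/proc/cpuinfo` probe only works on Linux x86-64.

```python
import platform

def cpu_meets_requirements():
    """Hypothetical startup check: require AVX on x86-64, skip the
    check entirely on ARM (which has NEON/SVE instead of AVX)."""
    machine = platform.machine().lower()

    # ARM CPUs (e.g. Snapdragon X) have no AVX at all; an AVX
    # requirement should not apply to them.
    if machine in ("arm64", "aarch64"):
        return True

    if machine in ("x86_64", "amd64"):
        # Linux-only probe: look for "avx" among the CPU flags.
        try:
            with open("/proc/cpuinfo") as f:
                for line in f:
                    if line.startswith("flags"):
                        return "avx" in line.split()
        except OSError:
            pass
        # Could not determine; don't block startup on an unknown.
        return True

    # Unknown architecture: report unsupported.
    return False
```

A check written this way would pass on the Snapdragon machines described above, but the heavy lifting would still need ARM-native (NEON) kernels to actually be fast, which is what the missing ARM build provides.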

manyoso commented 5 days ago

We'd love to support ARM-based CPUs such as this one, but to support them properly one of the developers needs actual access to the hardware, and unfortunately none of us has it at the moment. This is on our roadmap, but it will only come after we've procured the necessary hardware.

Anyone wanting to donate hardware to one of the developers so we can get this to users faster please feel free to reach out :)