netease-youdao / QAnything

Question and Answer based on Anything.
https://qanything.ai
GNU Affero General Public License v3.0

Mac M2: local environment fails to start with ggml_metal_init: error: Error #309

Open faceAngus opened 5 months ago

faceAngus commented 5 months ago

Is there an existing issue / discussion for this?

Is there an existing answer for this in the FAQ?

Current Behavior

No response

Expected Behavior

No response

Environment

- OS: macOS (Apple M2)
- NVIDIA Driver:
- CUDA:
- docker:
- docker-compose:
- NVIDIA GPU:
- NVIDIA GPU Memory:

QAnything logs

```
llama_new_context_with_model: n_ctx      = 4096
llama_new_context_with_model: n_batch    = 512
llama_new_context_with_model: n_ubatch   = 512
llama_new_context_with_model: freq_base  = 10000.0
llama_new_context_with_model: freq_scale = 1
ggml_metal_init: allocating
ggml_metal_init: found device: Apple M2 Max
ggml_metal_init: picking default device: Apple M2 Max
ggml_metal_init: using embedded metal library
ggml_metal_init: error: Error Domain=MTLLibraryErrorDomain Code=3 "program_source:155:11: error: unions are not supported in Metal
union {
      ^
program_source:176:11: error: unions are not supported in Metal
union {
      ^
program_source:197:11: error: unions are not supported in Metal
union {
      ^
program_source:219:11: error: unions are not supported in Metal
union {
      ^
program_source:264:11: error: unions are not supported in Metal
union {
      ^
program_source:291:11: error: unions are not supported in Metal
union {
      ^
```
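The log shows ggml's embedded Metal shader library failing to compile, so the Metal backend never comes up. One common workaround (a sketch of my own, not QAnything's code) is to keep every layer on the CPU so `ggml_metal_init` is never reached; the helper name and the model path in the usage comment are placeholders:

```python
# Workaround sketch: build llama-cpp-python constructor arguments that
# avoid the Metal backend entirely by offloading zero layers to the GPU.
def cpu_only_llama_kwargs(model_path: str, n_ctx: int = 4096) -> dict:
    return {
        "model_path": model_path,
        "n_ctx": n_ctx,        # same context size as in the log above
        "n_gpu_layers": 0,     # 0 = keep every layer on the CPU
    }

# Usage (requires llama-cpp-python and a local .gguf model; path is a placeholder):
# from llama_cpp import Llama
# llm = Llama(**cpu_only_llama_kwargs("models/your-model.gguf"))
```

CPU-only inference on an M2 is slower than Metal, but it sidesteps the shader compilation failure entirely.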

Steps To Reproduce

`bash scripts/run_for_3B_in_M1_mac.sh` fails to start.
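Since the script targets Apple Silicon Macs, it may help to confirm before launching that the interpreter is a native arm64 build; a Python running under Rosetta (x86_64) takes different ggml code paths and can mask or change this error. A small diagnostic sketch (the helper name is mine):

```python
import platform

def apple_silicon_info() -> dict:
    """Report the environment facts relevant to this issue: the OS,
    the CPU architecture, and whether the interpreter is native arm64."""
    return {
        "os": platform.system(),        # "Darwin" on macOS
        "machine": platform.machine(),  # "arm64" on M1/M2
        "native_arm64": platform.system() == "Darwin"
                        and platform.machine() == "arm64",
    }
```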

Anything else?

No response

faceAngus commented 5 months ago

```
llama_new_context_with_model: failed to initialize Metal backend
[2024-05-01 08:06:44 +0800] [2194] [ERROR] Experienced exception while trying to serve
Traceback (most recent call last):
  File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/sanic/mixins/startup.py", line 958, in serve_single
    worker_serve(monitor_publisher=None, **kwargs)
  File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/sanic/worker/serve.py", line 143, in worker_serve
    raise e
  File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/sanic/worker/serve.py", line 117, in worker_serve
    return _serve_http_1(
  File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/sanic/server/runners.py", line 223, in _serve_http_1
    loop.run_until_complete(app._server_event("init", "before"))
  File "uvloop/loop.pyx", line 1517, in uvloop.loop.Loop.run_until_complete
  File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/sanic/app.py", line 1764, in _server_event
    await self.dispatch(
  File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/sanic/signals.py", line 208, in dispatch
    return await dispatch
  File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/sanic/signals.py", line 183, in _dispatch
    raise e
  File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/sanic/signals.py", line 167, in _dispatch
    retval = await maybe_coroutine
  File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/sanic/app.py", line 1315, in _listener
    await maybe_coro
  File "/Users/angus/Desktop/workspase/pyProject/QAnything/qanything_kernel/qanything_server/sanic_api.py", line 177, in init_local_doc_qa
    local_doc_qa.init_cfg(args=args)
  File "/Users/angus/Desktop/workspase/pyProject/QAnything/qanything_kernel/core/local_doc_qa.py", line 71, in init_cfg
    self.llm: LlamaCPPCustomLLM = LlamaCPPCustomLLM(args)
  File "/Users/angus/Desktop/workspase/pyProject/QAnything/qanything_kernel/connector/llm/llm_for_llamacpp.py", line 25, in __init__
    self.llm = Llama(
  File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/llama_cpp/llama.py", line 337, in __init__
    self._ctx = _LlamaContext(
  File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/llama_cpp/_internals.py", line 265, in __init__
    raise ValueError("Failed to create llama_context")
ValueError: Failed to create llama_context
[2024-05-01 08:06:44 +0800] [2194] [INFO] Server Stopped
Traceback (most recent call last):
  File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/Users/angus/Desktop/workspase/pyProject/QAnything/qanything_kernel/qanything_server/sanic_api.py", line 210, in <module>
    app.run(host=args.host, port=args.port, single_process=True, access_log=False)
  File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/sanic/mixins/startup.py", line 215, in run
    serve(primary=self)  # type: ignore
  File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/sanic/mixins/startup.py", line 958, in serve_single
    worker_serve(monitor_publisher=None, **kwargs)
  File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/sanic/worker/serve.py", line 143, in worker_serve
    raise e
  File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/sanic/worker/serve.py", line 117, in worker_serve
    return _serve_http_1(
  File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/sanic/server/runners.py", line 223, in _serve_http_1
    loop.run_until_complete(app._server_event("init", "before"))
  File "uvloop/loop.pyx", line 1517, in uvloop.loop.Loop.run_until_complete
  File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/sanic/app.py", line 1764, in _server_event
    await self.dispatch(
  File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/sanic/signals.py", line 208, in dispatch
    return await dispatch
  File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/sanic/signals.py", line 183, in _dispatch
    raise e
  File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/sanic/signals.py", line 167, in _dispatch
    retval = await maybe_coroutine
  File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/sanic/app.py", line 1315, in _listener
    await maybe_coro
  File "/Users/angus/Desktop/workspase/pyProject/QAnything/qanything_kernel/qanything_server/sanic_api.py", line 177, in init_local_doc_qa
    local_doc_qa.init_cfg(args=args)
  File "/Users/angus/Desktop/workspase/pyProject/QAnything/qanything_kernel/core/local_doc_qa.py", line 71, in init_cfg
    self.llm: LlamaCPPCustomLLM = LlamaCPPCustomLLM(args)
  File "/Users/angus/Desktop/workspase/pyProject/QAnything/qanything_kernel/connector/llm/llm_for_llamacpp.py", line 25, in __init__
    self.llm = Llama(
  File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/llama_cpp/llama.py", line 337, in __init__
    self._ctx = _LlamaContext(
  File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/llama_cpp/_internals.py", line 265, in __init__
    raise ValueError("Failed to create llama_context")
ValueError: Failed to create llama_context
```
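The traceback shows QAnything's `LlamaCPPCustomLLM` constructing `Llama(...)` directly, so a failed Metal context kills the whole server. One way to make startup more resilient would be a wrapper that retries on CPU when context creation fails; this is a hypothetical helper of my own, not code that exists in QAnything:

```python
def load_llm_with_fallback(llama_cls, model_path: str, n_ctx: int = 4096):
    """Try GPU offload first; if llama_context creation fails (as with the
    broken Metal backend in this issue), retry with all layers on the CPU.

    `llama_cls` is expected to behave like llama_cpp.Llama: it raises
    ValueError("Failed to create llama_context") when the backend is unusable.
    """
    try:
        # -1 = offload all layers to the GPU (Metal on macOS)
        return llama_cls(model_path=model_path, n_ctx=n_ctx, n_gpu_layers=-1)
    except ValueError:
        # Fall back to CPU-only inference
        return llama_cls(model_path=model_path, n_ctx=n_ctx, n_gpu_layers=0)
```

Passing the class in makes the fallback logic easy to exercise without a real model file.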

Gavince commented 5 months ago

ValueError: Failed to create llama_context