Closed: noknownerrors closed this issue 8 months ago
Hi, we recently updated our Android flow and the documentation correspondingly: https://github.com/mlc-ai/mlc-llm/pull/1494. Please check whether the error still persists.
Besides, this could also be an issue of the model demanding too much memory. Try tweaking `prefill_chunk_size`, `context_window_size`, and `sliding_window_size` when compiling the model; for more about these params, see `python -m mlc_chat gen_config --help`.
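The advice above can be sketched as a command line. The flag names, model path, and values below are assumptions for illustration, not a verified invocation; check `python -m mlc_chat gen_config --help` against your installed version:

```shell
# Hypothetical gen_config invocation: flag names, model path, quantization,
# and values are placeholders; verify them against
# `python -m mlc_chat gen_config --help` for your mlc_chat version.
python -m mlc_chat gen_config ./dist/models/Llama-2-7b-chat-hf \
    --quantization q4f16_1 \
    --context-window-size 2048 \
    --prefill-chunk-size 256 \
    -o ./dist/Llama-2-7b-chat-hf-q4f16_1-MLC
```

Smaller values shrink the KV cache and temporary buffers at the cost of a shorter usable context.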
We also print an estimated memory requirement for each compiled model; something like this would be printed out:
[2024-02-08 20:49:38] INFO model_metadata.py:95: Total memory usage: 3730.64 MB (Parameters: 3615.13 MB. KVCache: 0.00 MB. Temporary buffer: 115.51 MB)
[2024-02-08 20:49:38] INFO model_metadata.py:104: To reduce memory usage, tweak `prefill_chunk_size`, `context_window_size` and `sliding_window_size`
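To make the breakdown in that log line concrete, here is a back-of-the-envelope sketch of how such an estimate decomposes. The KV-cache formula is the standard fp16 layout (2 tensors per layer, one vector per cached token); the model shape is a made-up example, not MLC-LLM's actual accounting:

```python
# Back-of-the-envelope memory estimate mirroring the "Total memory usage"
# log line above. Formula and model shape are illustrative assumptions.

def kv_cache_bytes(num_layers, context_window, num_kv_heads, head_dim,
                   bytes_per_elem=2):
    """KV cache: 2 tensors (K and V) per layer, one (heads x head_dim)
    vector per cached token, fp16 by default."""
    return 2 * num_layers * context_window * num_kv_heads * head_dim * bytes_per_elem

def total_mb(params_mb, kv_bytes, temp_mb):
    """Total = parameters + KV cache + temporary buffers, in MB."""
    return params_mb + kv_bytes / (1024 ** 2) + temp_mb

# Hypothetical 32-layer model, 32 KV heads of dim 128, 4096-token window:
kv = kv_cache_bytes(num_layers=32, context_window=4096,
                    num_kv_heads=32, head_dim=128)
print(f"KV cache: {kv / 1024**2:.2f} MB")  # 2048.00 MB

# With KVCache at 0.00 MB (as in the log), the total is just
# parameters + temporary buffer, reproducing the 3730.64 MB figure:
print(f"Total: {total_mb(3615.13, 0, 115.51):.2f} MB")  # 3730.64 MB
```

Halving `context_window_size` halves the KV-cache term, which is why shrinking it is the first knob to try on memory-constrained Android devices.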
Closing this one for now; feel free to open another one if issues persist!
🐛 Bug
To Reproduce
Steps to reproduce the behavior:
/home/user/documents/mlc-llm/3rdparty/tvm/src/runtime/opencl/opencl_device_api.cc:238: InternalError: Check failed: (err_code == CL_SUCCESS) is false: OpenCL Error, code=-61: CL_INVALID_BUFFER_SIZE
2023-12-04 18:37:04.362 13338-13419 TVM_RUNTIME ai.mlc.mlcchat A /home/user/documents/mlc-llm/3rdparty/tvm/include/tvm/runtime/packed_func.h:1346: unknown type = 0
2023-12-04 18:37:04.362 13338-13419 TVM_RUNTIME ai.mlc.mlcchat A /home/user/documents/mlc-llm/3rdparty/tvm/src/runtime/memory/memory_manager.cc:162: Allocator for
2023-12-04 18:37:04.362 13338-13419 libc++abi ai.mlc.mlcchat E terminating due to uncaught exception of type tvm::runtime::InternalError: [18:37:04] /home/user/documents/mlc-llm/3rdparty/tvm/src/runtime/memory/memory_manager.cc:162: Allocator for
Stack trace not available when DMLC_LOG_STACK_TRACE is disabled at compile time.
2023-12-04 18:37:04.364 13338-13419 libc ai.mlc.mlcchat A Fatal signal 6 (SIGABRT), code -1 (SI_QUEUE) in tid 13419 (pool-3-thread-1), pid 13338 (ai.mlc.mlcchat)
Expected behavior
Chat with the model
Environment
- How you installed MLC-LLM (`conda`, source): conda
- How you installed TVM-Unity (`pip`, source): the wheel in the docs
- TVM Unity Hash Tag (`python -c "import tvm; print('\n'.join(f'{k}: {v}' for k, v in tvm.support.libinfo().items()))"`, applicable if you compile models):

Additional context