Closed jmtatsch closed 1 year ago
Currently there is a bug (https://github.com/abetlen/llama-cpp-python/issues/110) that freezes llama_cpp when n_threads exceeds the number of available (hyper)threads. Until it is fixed, set n_threads to a conservative value that virtually all users' machines will have.
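A minimal sketch of such a guard: detect the machine's thread count and clamp n_threads to a conservative cap. The `cap=4` default and the `Llama(...)` usage shown in the comment are illustrative assumptions, not part of the original report.

```python
import os

def safe_thread_count(cap: int = 4) -> int:
    """Pick an n_threads value that cannot exceed the machine's
    (hyper)thread count, clamped to a conservative cap."""
    detected = os.cpu_count() or 1  # os.cpu_count() may return None
    return max(1, min(cap, detected))

# Hypothetical usage with llama-cpp-python (model path is illustrative):
# from llama_cpp import Llama
# llm = Llama(model_path="model.gguf", n_threads=safe_thread_count())

print(safe_thread_count())
```

This avoids the freeze because the chosen value can never exceed what `os.cpu_count()` reports for the current machine.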