simonw / llm-llama-cpp

LLM plugin for running models using llama.cpp
Apache License 2.0
136 stars 19 forks

Typo in README - num_gpu_layers should be n_gpu_layers #24

Closed · curreta closed this issue 9 months ago

curreta commented 9 months ago

File: README

-o num_gpu_layers 10 - increase the n_gpu_layers argument to a higher value (the default is 1)

-o num_gpu_layers 10 does not work, but -o n_gpu_layers 10 does
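
For anyone else hitting this, here is a minimal sketch of the working invocation. The model alias (llama-2-7b-chat) and the prompt are illustrative; substitute whichever alias the plugin registered for your downloaded model:

    llm -m llama-2-7b-chat -o n_gpu_layers 10 'Five great names for a pet pelican'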

simonw commented 9 months ago

Thanks!