LostRuins / koboldcpp

Run GGUF models easily with a KoboldAI UI. One File. Zero Install.
https://github.com/lostruins/koboldcpp
GNU Affero General Public License v3.0

Format for running a model from command line? #70

Closed. Enferlain closed this issue 1 year ago.

Enferlain commented 1 year ago

I wanna try the new options like this: koboldcpp.exe --useclblast 0 0 and --smartcontext

Previously when I tried --smartcontext it let me select a model the same way as if I just ran the exe normally, but with the other flag added it now says cannot find model file:

I saw that I should do [model_file], but neither [ggml-model-q4_0.bin] nor --ggml-model-q4_0.bin works. What would be the correct format?

Omniphantasm commented 1 year ago

.\koboldcpp.exe --useclblast 0 0 --smartcontext

It'll just pop up the window that lets you select a model file same as always.

Unless I'm misunderstanding what you're trying to do?

LostRuins commented 1 year ago

The command line arguments work for me. In any case, you can also pass in a path to the model as the first argument.
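Putting the two together, a sketch of the full invocation (the model filename is the one from the question above, so substitute your own path):

```shell
# Pass the model path as the first positional argument,
# alongside the flags discussed in this thread
koboldcpp.exe ggml-model-q4_0.bin --useclblast 0 0 --smartcontext
```

With the model path given up front, the file-picker window is skipped.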

dsiens commented 1 year ago

Could it be possible to change the model from the UI in a future version? Like KoboldAI Lite online.

Enferlain commented 1 year ago

@Omniphantasm this worked, thanks.

Any idea what a constant spam of

Error creating OpenCL Buffer A: -4
Error creating OpenCL Buffer B: -4
Error creating OpenCL Buffer C: -4

means?

Omniphantasm commented 1 year ago

Pretty sure it just means you targeted the wrong hardware with --useclblast. You'll have to fiddle around with the numbers after it to find which combination is your GPU: 0 1, 1 0, 1 1, etc.

I don't think there's an easy way to figure out which to put; for me, 0 1 uses my GPU and 0 0 uses my iGPU.
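For reference, the -4 in those buffer errors is CL_MEM_OBJECT_ALLOCATION_FAILURE in the OpenCL spec, i.e. the selected device couldn't allocate the buffer, which is consistent with targeting the wrong (or memory-starved) device. A small lookup sketch, using an illustrative subset of the status codes defined in cl.h:

```python
# Subset of OpenCL runtime status codes from cl.h (illustrative, not exhaustive)
CL_ERRORS = {
    0: "CL_SUCCESS",
    -1: "CL_DEVICE_NOT_FOUND",
    -2: "CL_DEVICE_NOT_AVAILABLE",
    -4: "CL_MEM_OBJECT_ALLOCATION_FAILURE",
    -5: "CL_OUT_OF_RESOURCES",
    -6: "CL_OUT_OF_HOST_MEMORY",
}

def describe(code: int) -> str:
    """Map an OpenCL status code to its symbolic name."""
    return CL_ERRORS.get(code, f"unknown OpenCL error {code}")

print(describe(-4))  # CL_MEM_OBJECT_ALLOCATION_FAILURE
```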

Enferlain commented 1 year ago

It's working now with 0 0. I was getting the error randomly when I was AFK, but I've been running it for like 4 hours at this point with no issues. Thanks btw