cocktailpeanut / dalai

The simplest way to run LLaMA on your local machine
https://cocktailpeanut.github.io/dalai

"npx dalai llama install 7B" fails with "./quantize : The term './quantize' is not recognized" #448

Open rmangino opened 1 year ago

rmangino commented 1 year ago

Running npx dalai llama install 7B fails with:

```
PS C:\Users\xxxx\dalai\llama\build\Release> [System.Console]::OutputEncoding=[System.Console]::InputEncoding=[System.Text.Encoding]::UTF8; ./quantize C:\Users\xxxx\dalai\llama\models\7B\ggml-model-f16.bin C:\Users\xxxx\dalai\llama\models\7B\ggml-model-q4_0.bin 2

./quantize : The term './quantize' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again. At line:1 char:96
```
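This PowerShell error normally just means no `quantize` executable exists in the directory the command runs from; the binary was built, but into a different folder. A quick way to confirm where it actually landed is to search the build tree (in PowerShell, roughly `Get-ChildItem -Recurse -Filter quantize.exe`). A portable sketch with a mock directory tree, assuming the binary ended up one level deeper than expected:

```shell
# Mock the layout this issue describes: quantize.exe built into
# build/bin/Release rather than build/Release.
tmp=$(mktemp -d)
cd "$tmp"
mkdir -p build/bin/Release
touch build/bin/Release/quantize.exe   # stand-in for the real binary

# Locate the binary anywhere under the build tree.
find build -name quantize.exe          # prints build/bin/Release/quantize.exe
```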

npecko commented 1 year ago

While the install is running, before it reaches the quantize step, manually go into the `build\bin\Release` directory and move the three files there into `build\Release`. After that, quantize can do its job and the install won't fail.
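The workaround above can be sketched as follows, using mock files in a temp directory. The file names are placeholders; the assumption is that the Windows build drops `quantize.exe` and its companions into `build\bin\Release` instead of `build\Release`, where the installer expects them. The PowerShell equivalent would be roughly `Move-Item .\build\bin\Release\* .\build\Release\`:

```shell
# Recreate the mismatched layout with mock files.
tmp=$(mktemp -d)
cd "$tmp"
mkdir -p build/bin/Release build/Release
touch build/bin/Release/quantize.exe \
      build/bin/Release/main.exe \
      build/bin/Release/llama.dll      # placeholder names for the three files

# Relocate the binaries so ./quantize resolves from build/Release.
mv build/bin/Release/* build/Release/
ls build/Release
```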