Issue 0013227 · Open · opened 1 year ago
It's locked and I get the following message:
C:\Users\tradi\Documents\VICUNA\oobabooga_windows\installer_files\env\lib\site-packages\bitsandbytes\cextension.py:33: UserWarning: The installed version of bitsandbytes was compiled without GPU support. 8-bit optimizers, 8-bit multiplication, and GPU quantization are unavailable.
  warn("The installed version of bitsandbytes was compiled without GPU support. "
INFO:Loading eachadea_ggml-vicuna-7b-1-1...
INFO:llama.cpp weights detected: models\eachadea_ggml-vicuna-7b-1-1\ggml-vic7b-uncensored-q5_1.bin
llama.cpp: loading model from models\eachadea_ggml-vicuna-7b-1-1\ggml-vic7b-uncensored-q5_1.bin
llama_model_load_internal: format     = ggjt v1 (latest)
llama_model_load_internal: n_vocab    = 32001
llama_model_load_internal: n_ctx      = 2048
llama_model_load_internal: n_embd     = 4096
llama_model_load_internal: n_mult     = 256
llama_model_load_internal: n_head     = 32
llama_model_load_internal: n_layer    = 32
llama_model_load_internal: n_rot      = 128
llama_model_load_internal: ftype      = 9 (mostly Q5_1)
llama_model_load_internal: n_ff       = 11008
llama_model_load_internal: n_parts    = 1
llama_model_load_internal: model size = 7B
llama_model_load_internal: ggml ctx size =   59.11 KB
llama_model_load_internal: mem required  = 6612.58 MB (+ 1026.00 MB per state)
llama_init_from_file: kv self size  = 1024.00 MB
AVX = 1 | AVX2 = 1 | AVX512 = 0 | AVX512_VBMI = 0 | AVX512_VNNI = 0 | FMA = 1 | NEON = 0 | ARM_FMA = 0 | F16C = 1 | FP16_VA = 0 | WASM_SIMD = 0 | BLAS = 0 | SSE3 = 1 | VSX = 0 |
INFO:Loading the extension "gallery"...
Running on local URL: http://127.0.0.1:7860
Any suggestions?
weitianrui@weitianruideMacBook-Pro llama.cpp % npx dalai serve
mkdir /Users/weitianrui/dalai
Server running on http://localhost:3000/
TypeError: Cannot read properties of undefined (reading 'split')
    at Dalai.query (/Users/weitianrui/node_modules/dalai/index.js:219:35)
    at Socket.<anonymous> (/Users/weitianrui/node_modules/dalai/index.js:534:20)
    at Socket.emit (node:events:513:28)
    at Socket.emitUntyped (/Users/weitianrui/node_modules/socket.io/dist/typed-events.js:69:22)
    at /Users/weitianrui/node_modules/socket.io/dist/socket.js:703:39
    at process.processTicksAndRejections (node:internal/process/task_queues:77:11)
Node.js v18.16.0
weitianrui@weitianruideMacBook-Pro llama.cpp %
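The TypeError above means `Dalai.query` called `.split()` on a value that was `undefined` when the socket event arrived. A minimal sketch of that failure mode with a defensive guard — the function name and `prompt` field here are illustrative, not dalai's actual code:

```javascript
// Hypothetical reduction of the crash at dalai/index.js:219.
// Calling .split() on undefined throws:
//   TypeError: Cannot read properties of undefined (reading 'split')
function parsePrompt(req) {
  // Guard: fall back to an empty string when the expected field is
  // missing, instead of letting .split() blow up on undefined.
  const prompt = req && typeof req.prompt === "string" ? req.prompt : "";
  return prompt.split("\n").filter((line) => line.length > 0);
}

console.log(parsePrompt({ prompt: "hello\nworld" })); // [ 'hello', 'world' ]
console.log(parsePrompt({})); // [] -- no TypeError
```

A common report with this particular dalai crash is that it fires when no model has been installed yet, so verifying that a model download actually completed (e.g. `npx dalai llama install 7B`) is worth trying before anything else.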