cocktailpeanut / dalai

The simplest way to run LLaMA on your local machine
https://cocktailpeanut.github.io/dalai

ModuleNotFoundError: No module named 'numpy' #423

Closed Z3r01mPact closed 1 year ago

Z3r01mPact commented 1 year ago

PS C:\Users\XXX\dalai\llama> [System.Console]::OutputEncoding=[System.Console]::InputEncoding=[System.Text.Encoding]::UTF8; C:\Users\XXX\dalai\venv\Scripts\python.exe convert-pth-to-ggml.py models/13B/ 1
Traceback (most recent call last):
  File "C:\Users\XXX\dalai\llama\convert-pth-to-ggml.py", line 22, in <module>
    import numpy as np
ModuleNotFoundError: No module named 'numpy'
PS C:\Users\XXX\dalai\llama> exit
exec: ./quantize C:\Users\XXX\dalai\llama\models\13B\ggml-model-f16.bin C:\Users\XXX\dalai\llama\models\13B\ggml-model-q4_0.bin 2 in C:\Users\XXX\dalai\llama\build\Release
caught error Error: Cannot create process, error code: 267
    at new WindowsPtyAgent (C:\Users\XXX\AppData\Local\npm-cache_npx\3c737cbb02d79cc9\node_modules\node-pty\lib\windowsPtyAgent.js:105:43)
    at new WindowsTerminal (C:\Users\XXX\AppData\Local\npm-cache_npx\3c737cbb02d79cc9\node_modules\node-pty\lib\windowsTerminal.js:50:24)
    at Object.spawn (C:\Users\XXX\AppData\Local\npm-cache_npx\3c737cbb02d79cc9\node_modules\node-pty\lib\index.js:28:12)
    at C:\Users\XXX\AppData\Local\npm-cache_npx\3c737cbb02d79cc9\node_modules\dalai\index.js:562:31
    at new Promise (<anonymous>)
    at Dalai.exec (C:\Users\XXX\AppData\Local\npm-cache_npx\3c737cbb02d79cc9\node_modules\dalai\index.js:555:12)
    at LLaMA.quantize (C:\Users\XXX\AppData\Local\npm-cache_npx\3c737cbb02d79cc9\node_modules\dalai\llama.js:121:23)
    at LLaMA.add (C:\Users\XXX\AppData\Local\npm-cache_npx\3c737cbb02d79cc9\node_modules\dalai\llama.js:102:18)
    at async Dalai.install (C:\Users\XXX\AppData\Local\npm-cache_npx\3c737cbb02d79cc9\node_modules\dalai\index.js:350:15)

PS D:\VMs\dalai> python -V
Python 3.11.3
PS D:\VMs\dalai> python -m pip show numpy
Name: numpy
Version: 1.23.5
Summary: NumPy is the fundamental package for array computing with Python.
Home-page: https://www.numpy.org
Author: Travis E. Oliphant et al.
Author-email:
License: BSD
Location: C:\Python311\Lib\site-packages
Requires:
Required-by: edgetpu, h5py, jax, ml-dtypes, opt-einsum, pandas, scikit-learn, scipy, tensorboard, tensorflow-intel
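Worth noting: the traceback above comes from the venv interpreter (C:\Users\XXX\dalai\venv\Scripts\python.exe), while pip show is reporting the numpy installed under the system Python at C:\Python311\Lib\site-packages. A quick way to confirm the mismatch is to ask the venv interpreter directly what it sees (a diagnostic sketch, assuming the venv path from the traceback is the one dalai invokes):

PS D:\VMs\dalai> C:\Users\XXX\dalai\venv\Scripts\python.exe -c "import sys; print(sys.executable)"
PS D:\VMs\dalai> C:\Users\XXX\dalai\venv\Scripts\python.exe -m pip show numpy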

Z3r01mPact commented 1 year ago

PS D:\VMs\dalai> pip3 install numpy
Requirement already satisfied: numpy in c:\python311\lib\site-packages (1.23.5)
PS D:\VMs\dalai> pip install numpy
Requirement already satisfied: numpy in c:\python311\lib\site-packages (1.23.5)
PS D:\VMs\dalai> python -V
Python 3.11.3
PS D:\VMs\dalai> python3 -V
Python 3.10.11
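Both pip and pip3 here install into the system Python at C:\Python311, not into the venv that the installer actually runs (C:\Users\XXX\dalai\venv). A possible fix, assuming that venv is the one dalai uses, is to drive pip through the venv's own interpreter so the packages land in its site-packages:

PS D:\VMs\dalai> C:\Users\XXX\dalai\venv\Scripts\python.exe -m pip install numpy sentencepiece torch

Running pip as python.exe -m pip ties the install to the interpreter on the left, regardless of what pip or pip3 happen to point at.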

Z3r01mPact commented 1 year ago

So within python, if I "import numpy as np" it works. If I run python3 and "import numpy as np", it does not work.

Yet the error generated from "python.exe convert-pth-to-ggml.py" is using python, not python3.

Anyone seen this??
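To narrow down why python and python3 disagree, it may help to check which executables the two names resolve to and which site-packages each one searches (a diagnostic sketch; python3 here is a separate 3.10.11 install with its own packages, which would explain numpy importing under one and not the other):

PS D:\VMs\dalai> Get-Command python, python3 | Select-Object Name, Source
PS D:\VMs\dalai> python -c "import sys; print(sys.executable)"
PS D:\VMs\dalai> python3 -c "import sys; print(sys.executable)"
PS D:\VMs\dalai> python3 -m pip show numpy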

Z3r01mPact commented 1 year ago

PS D:\VMs\dalai\llama> pip3 install torch
Collecting torch
  Downloading torch-2.0.0-cp311-cp311-win_amd64.whl (172.3 MB)
     ---------------------------------------- 172.3/172.3 MB 18.7 MB/s eta 0:00:00
Requirement already satisfied: filelock in c:\users\XXX\appdata\roaming\python\python311\site-packages (from torch) (3.10.6)
Requirement already satisfied: typing-extensions in c:\python311\lib\site-packages (from torch) (4.5.0)
Collecting sympy (from torch)
  Using cached sympy-1.11.1-py3-none-any.whl (6.5 MB)
Collecting networkx (from torch)
  Downloading networkx-3.1-py3-none-any.whl (2.1 MB)
     ---------------------------------------- 2.1/2.1 MB 26.4 MB/s eta 0:00:00
Requirement already satisfied: jinja2 in c:\python311\lib\site-packages (from torch) (3.1.2)
Requirement already satisfied: MarkupSafe>=2.0 in c:\python311\lib\site-packages (from jinja2->torch) (2.1.2)
Collecting mpmath>=0.19 (from sympy->torch)
  Using cached mpmath-1.3.0-py3-none-any.whl (536 kB)
Installing collected packages: mpmath, sympy, networkx, torch
Successfully installed mpmath-1.3.0 networkx-3.1 sympy-1.11.1 torch-2.0.0

Reran the .py script again...

PS D:\VMs\dalai\llama> python.exe convert-pth-to-ggml.py
Traceback (most recent call last):
  File "D:\VMs\dalai\llama\convert-pth-to-ggml.py", line 25, in <module>
    from sentencepiece import SentencePieceProcessor
ModuleNotFoundError: No module named 'sentencepiece'

pip3 install sentencepiece
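Side note: pip3 can target a different interpreter than the python.exe that runs the script. A safer variant, assuming the script keeps being run with that same python.exe, is:

PS D:\VMs\dalai\llama> python.exe -m pip install sentencepiece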

Reran the .py script again...

PS D:\VMs\dalai\llama> python.exe convert-pth-to-ggml.py
usage: convert-pth-to-ggml.py [-h] dir_model {0,1} [vocab_only]
convert-pth-to-ggml.py: error: the following arguments are required: dir_model, ftype

Reran the entire "PS D:\VMs\dalai\llama> npx dalai llama install 13B --prefix D:\VMs\" again.

Still complains:

data C:\Users\XXX\dalai\venv\Scripts\cmake : The term 'C:\Users\XXX\dalai\venv\Scripts\cmake' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.

but it proceeds to download the models... again... (it should be checking the file versions rather than downloading everything all over again, but hey ho!!)
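For the cmake error: the installer is looking for a cmake executable inside the venv's Scripts folder. One possible (untested) fix is to install the PyPI cmake package into that venv, which places a cmake.exe in exactly that Scripts directory:

PS D:\VMs\dalai> C:\Users\XXX\dalai\venv\Scripts\python.exe -m pip install cmake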

Z3r01mPact commented 1 year ago

PS C:\Users\XXX\dalai\llama> [System.Console]::OutputEncoding=[System.Console]::InputEncoding=[System.Text.Encoding]::UTF8; C:\Users\XXX\dalai\venv\Scripts\python.exe convert-pth-to-ggml.py models/13B/ 1
Traceback (most recent call last):
  File "C:\Users\XXX\dalai\llama\convert-pth-to-ggml.py", line 22, in <module>
    import numpy as np
ModuleNotFoundError: No module named 'numpy'
PS C:\Users\XXX\dalai\llama> exit
exec: ./quantize C:\Users\XXX\dalai\llama\models\13B\ggml-model-f16.bin C:\Users\XXX\dalai\llama\models\13B\ggml-model-q4_0.bin 2 in C:\Users\XXX\dalai\llama\build\Release
caught error Error: Cannot create process, error code: 267
    at new WindowsPtyAgent (C:\Users\XXX\AppData\Local\npm-cache_npx\3c737cbb02d79cc9\node_modules\node-pty\lib\windowsPtyAgent.js:105:43)
    at new WindowsTerminal (C:\Users\XXX\AppData\Local\npm-cache_npx\3c737cbb02d79cc9\node_modules\node-pty\lib\windowsTerminal.js:50:24)
    at Object.spawn (C:\Users\XXX\AppData\Local\npm-cache_npx\3c737cbb02d79cc9\node_modules\node-pty\lib\index.js:28:12)
    at C:\Users\XXX\AppData\Local\npm-cache_npx\3c737cbb02d79cc9\node_modules\dalai\index.js:562:31
    at new Promise (<anonymous>)
    at Dalai.exec (C:\Users\XXX\AppData\Local\npm-cache_npx\3c737cbb02d79cc9\node_modules\dalai\index.js:555:12)
    at LLaMA.quantize (C:\Users\XXX\AppData\Local\npm-cache_npx\3c737cbb02d79cc9\node_modules\dalai\llama.js:121:23)
    at LLaMA.add (C:\Users\XXX\AppData\Local\npm-cache_npx\3c737cbb02d79cc9\node_modules\dalai\llama.js:102:18)
    at async Dalai.install (C:\Users\XXX\AppData\Local\npm-cache_npx\3c737cbb02d79cc9\node_modules\dalai\index.js:350:15)

Nope... something is still going wrong.

Z3r01mPact commented 1 year ago

Anyone know what arguments are being passed to "C:\Users\XXX\dalai\llama\convert-pth-to-ggml.py" after it's done downloading the models?

I tried running it as python3 and it complains it can't find numpy, but running it as python it complains "convert-pth-to-ggml.py: error: the following arguments are required: dir_model, ftype".
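For reference, the first transcript at the top shows what dalai passes: it runs convert-pth-to-ggml.py models/13B/ 1, so dir_model is the model folder and ftype is 1 (which, if I'm reading the llama.cpp converter right, selects f16 output; 0 would be f32). Run by hand that would look like:

PS D:\VMs\dalai\llama> python.exe convert-pth-to-ggml.py models/13B/ 1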

Z3r01mPact commented 1 year ago

OK, so I went a different route: I managed to get a Docker Compose setup running that sets everything up, and got 30B downloaded and working, albeit slowly.
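For anyone else going the Docker route, the dalai repo ships a compose setup; roughly (the exact service name and commands are from memory of the dalai README, so treat them as assumptions):

git clone https://github.com/cocktailpeanut/dalai.git
cd dalai
docker compose build
docker compose run dalai npx dalai llama install 30B
docker compose up -d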

Z3r01mPact commented 1 year ago

Interesting journey, but far more complex than it should be.