(3.9-JupyterLab) H:\models>python
Python 3.9.9 (tags/v3.9.9:ccb0e6a, Nov 15 2021, 18:08:50) [MSC v.1929 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> from llama_cpp import Llama
>>> llm = Llama(model_path="llama-2-7b-chat.Q2_K.gguf")
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "C:\Users\User\AppData\Local\Programs\3.9-JupyterLab\lib\site-packages\llama_cpp\llama.py", line 332, in __init__
    self.model = llama_cpp.llama_load_model_from_file(
  File "C:\Users\User\AppData\Local\Programs\3.9-JupyterLab\lib\site-packages\llama_cpp\llama_cpp.py", line 434, in llama_load_model_from_file
    return _lib.llama_load_model_from_file(path_model, params)
OSError: [WinError -1073741795] Windows Error 0xc000001d
>>> llm = Llama(model_path="./llama-2-7b-chat.Q2_K.gguf")
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "C:\Users\User\AppData\Local\Programs\3.9-JupyterLab\lib\site-packages\llama_cpp\llama.py", line 332, in __init__
    self.model = llama_cpp.llama_load_model_from_file(
  File "C:\Users\User\AppData\Local\Programs\3.9-JupyterLab\lib\site-packages\llama_cpp\llama_cpp.py", line 434, in llama_load_model_from_file
    return _lib.llama_load_model_from_file(path_model, params)
OSError: [WinError -1073741795] Windows Error 0xc000001d
>>> llm = Llama(model_path=".//llama-2-7b-chat.Q2_K.gguf")
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "C:\Users\User\AppData\Local\Programs\3.9-JupyterLab\lib\site-packages\llama_cpp\llama.py", line 332, in __init__
    self.model = llama_cpp.llama_load_model_from_file(
  File "C:\Users\User\AppData\Local\Programs\3.9-JupyterLab\lib\site-packages\llama_cpp\llama_cpp.py", line 434, in llama_load_model_from_file
    return _lib.llama_load_model_from_file(path_model, params)
OSError: [WinError -1073741795] Windows Error 0xc000001d
>>> llm = Llama(model_path="h://models//llama-2-7b-chat.Q2_K.gguf")
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "C:\Users\User\AppData\Local\Programs\3.9-JupyterLab\lib\site-packages\llama_cpp\llama.py", line 332, in __init__
    self.model = llama_cpp.llama_load_model_from_file(
  File "C:\Users\User\AppData\Local\Programs\3.9-JupyterLab\lib\site-packages\llama_cpp\llama_cpp.py", line 434, in llama_load_model_from_file
    return _lib.llama_load_model_from_file(path_model, params)
OSError: [WinError -1073741795] Windows Error 0xc000001d
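For reference, converting the negative WinError value Python reports into its unsigned 32-bit form recovers the NTSTATUS code shown in the message:

```python
# Convert the signed WinError value to its unsigned 32-bit NTSTATUS form.
# 0xC000001D is STATUS_ILLEGAL_INSTRUCTION: the process executed a CPU
# instruction the processor does not support.
code = -1073741795 & 0xFFFFFFFF
print(hex(code))  # 0xc000001d
```

Since every path variant fails with the same code, this points away from a path problem. A common cause (an assumption here, not something the traceback proves) is a prebuilt llama-cpp-python wheel compiled with AVX/AVX2 instructions the local CPU lacks; reinstalling the package built from source with those instruction sets disabled may help.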
I start Python 3.9 directly inside the models folder where the .gguf file is located, on Windows 10 (not WSL):
from llama_cpp import Llama
Llama(model_path="llama-2-7b-chat.Q2_K.gguf")
I also tried:
Llama(model_path="./llama-2-7b-chat.Q2_K.gguf")
Llama(model_path="h://models//llama-2-7b-chat.Q2_K.gguf")
Llama(model_path="h:\models\llama-2-7b-chat.Q2_K.gguf")
Llama(model_path="Marx-3B-V2-Q4_1-GGUF.gguf")
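To rule out the path itself, a quick diagnostic (a sketch; `check_model` is a hypothetical helper, and the filename is the one from the session above) confirms Python can see the file before `Llama()` tries to load it:

```python
import os

def check_model(path):
    """Return a short diagnostic string for a model path (hypothetical helper)."""
    if not os.path.exists(path):
        return "missing"
    return f"found, {os.path.getsize(path)} bytes"

# In the failing session this would be run from H:\models:
print(os.getcwd())
print(check_model("llama-2-7b-chat.Q2_K.gguf"))
```

If this reports "found" and loading still crashes with 0xc000001d, the path is not the issue and the problem is more likely in the compiled llama.cpp library itself.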