PromtEngineer / localGPT

Chat with your documents on your local device using GPT models. No data leaves your device, and it is 100% private.
Apache License 2.0

I get this error #2

Open lelapin123 opened 1 year ago

lelapin123 commented 1 year ago

"Torch not compiled with CUDA enabled" on windows 10

lelapin123 commented 1 year ago

I fixed the issue with:

pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu117

source: https://pytorch.org/get-started/locally/

However, it later gave me this error:

ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
llama-cpp-python 0.1.48 requires typing-extensions>=4.5.0, but you have typing-extensions 4.4.0 which is incompatible.
chromadb 0.3.22 requires typing-extensions>=4.5.0, but you have typing-extensions 4.4.0 which is incompatible.

Note that I am not in a clean conda environment (I had used it before), but I was still able to run the script afterwards.
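For anyone hitting the same thing: you can tell which build a wheel is from the local version suffix PyTorch reports ('+cu117' is a CUDA 11.7 build, '+cpu' is CPU-only). A minimal stdlib sketch of that check (the helper name is mine; `torch.cuda.is_available()` remains the authoritative runtime test):

```python
def is_cuda_build(version: str) -> bool:
    """Guess from torch.__version__ whether the wheel was built with CUDA.

    PyTorch wheels encode the build variant in the local version segment
    after '+': e.g. '2.0.2+cu117' (CUDA 11.7) vs '2.0.1+cpu' (CPU-only).
    """
    _, _, local = version.partition("+")
    return local.startswith("cu")

print(is_cuda_build("2.0.2+cu117"))  # True  -> wheel from the cu117 index
print(is_cuda_build("2.0.1+cpu"))    # False -> CPU-only wheel
```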

MildPanda commented 1 year ago

I couldn't fix it this way when I ran into this issue. I tried installing PyTorch per their website instructions, but pip reports the requirements as already satisfied, so it is clearly installed.

lelapin123 commented 1 year ago

You need to add the force-reinstall option: --force-reinstall

seoadsem commented 1 year ago

Python version: 3.11.3. Video card: RX 590. PC:

Device name DESKTOP-PC
Processor   AMD Ryzen 9 3900X 12-Core Processor               3.60 GHz
Installed RAM   64.0 GB (63.2 GB usable)
Device ID   643A3F71-DB57-498B-9E3B-DE7F4EAD0571
Product ID  00330-80208-36777-AA716
System type 64-bit operating system, x64-based processor
Pen and touch   No pen or touch input is available for this display

TRIED:

(LocalGPT) E:\ARTIFICIAL-INTELLIGENCE\localGPT>pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu117
Looking in indexes: https://download.pytorch.org/whl/cu117
Requirement already satisfied: torch in e:\anaconda3\envs\localgpt\lib\site-packages (2.0.1)
Requirement already satisfied: torchvision in e:\anaconda3\envs\localgpt\lib\site-packages (0.15.2)
Collecting torchaudio
  Using cached https://download.pytorch.org/whl/cu117/torchaudio-2.0.2%2Bcu117-cp311-cp311-win_amd64.whl (2.5 MB)
Requirement already satisfied: filelock in e:\anaconda3\envs\localgpt\lib\site-packages (from torch) (3.12.0)
Requirement already satisfied: typing-extensions in e:\anaconda3\envs\localgpt\lib\site-packages (from torch) (4.6.2)
Requirement already satisfied: sympy in e:\anaconda3\envs\localgpt\lib\site-packages (from torch) (1.12)
Requirement already satisfied: networkx in e:\anaconda3\envs\localgpt\lib\site-packages (from torch) (3.1)
Requirement already satisfied: jinja2 in e:\anaconda3\envs\localgpt\lib\site-packages (from torch) (3.1.2)
Requirement already satisfied: numpy in e:\anaconda3\envs\localgpt\lib\site-packages (from torchvision) (1.24.3)
Requirement already satisfied: requests in e:\anaconda3\envs\localgpt\lib\site-packages (from torchvision) (2.31.0)
Requirement already satisfied: pillow!=8.3.*,>=5.3.0 in e:\anaconda3\envs\localgpt\lib\site-packages (from torchvision) (9.5.0)
Requirement already satisfied: MarkupSafe>=2.0 in e:\anaconda3\envs\localgpt\lib\site-packages (from jinja2->torch) (2.1.2)
Requirement already satisfied: charset-normalizer<4,>=2 in e:\anaconda3\envs\localgpt\lib\site-packages (from requests->torchvision) (3.1.0)
Requirement already satisfied: idna<4,>=2.5 in e:\anaconda3\envs\localgpt\lib\site-packages (from requests->torchvision) (3.4)
Requirement already satisfied: urllib3<3,>=1.21.1 in e:\anaconda3\envs\localgpt\lib\site-packages (from requests->torchvision) (1.26.6)
Requirement already satisfied: certifi>=2017.4.17 in e:\anaconda3\envs\localgpt\lib\site-packages (from requests->torchvision) (2023.5.7)
Requirement already satisfied: mpmath>=0.19 in e:\anaconda3\envs\localgpt\lib\site-packages (from sympy->torch) (1.3.0)
Installing collected packages: torchaudio
Successfully installed torchaudio-2.0.2+cu117

(LocalGPT) E:\ARTIFICIAL-INTELLIGENCE\localGPT>python run_localGPT.py
load INSTRUCTOR_Transformer
max_seq_length  512
Using embedded DuckDB with persistence: data will be stored in: E:\ARTIFICIAL-INTELLIGENCE\localGPT
Loading checkpoint shards: 100%|██████████████████████████████████████████████████████| 2/2 [00:16<00:00,  8.39s/it]
Xformers is not installed correctly. If you want to use memorry_efficient_attention to accelerate training use the following command to install Xformers
pip install xformers.

TRIED ALSO:

E:\ARTIFICIAL-INTELLIGENCE\localGPT>pip install torch --force-reinstall
Collecting torch
  Using cached torch-2.0.1-cp311-cp311-win_amd64.whl (172.3 MB)
Collecting filelock (from torch)
  Using cached filelock-3.12.0-py3-none-any.whl (10 kB)
Collecting typing-extensions (from torch)
  Using cached typing_extensions-4.6.2-py3-none-any.whl (31 kB)
Collecting sympy (from torch)
  Using cached sympy-1.12-py3-none-any.whl (5.7 MB)
Collecting networkx (from torch)
  Using cached networkx-3.1-py3-none-any.whl (2.1 MB)
Collecting jinja2 (from torch)
  Using cached Jinja2-3.1.2-py3-none-any.whl (133 kB)
Collecting MarkupSafe>=2.0 (from jinja2->torch)
  Using cached MarkupSafe-2.1.2-cp311-cp311-win_amd64.whl (16 kB)
Collecting mpmath>=0.19 (from sympy->torch)
  Using cached mpmath-1.3.0-py3-none-any.whl (536 kB)
Installing collected packages: mpmath, typing-extensions, sympy, networkx, MarkupSafe, filelock, jinja2, torch
  Attempting uninstall: mpmath
    Found existing installation: mpmath 1.3.0
    Uninstalling mpmath-1.3.0:
      Successfully uninstalled mpmath-1.3.0
  Attempting uninstall: typing-extensions
    Found existing installation: typing_extensions 4.6.2
    Uninstalling typing_extensions-4.6.2:
      Successfully uninstalled typing_extensions-4.6.2
  Attempting uninstall: sympy
    Found existing installation: sympy 1.12
    Uninstalling sympy-1.12:
      Successfully uninstalled sympy-1.12
  Attempting uninstall: networkx
    Found existing installation: networkx 3.1
    Uninstalling networkx-3.1:
      Successfully uninstalled networkx-3.1
  Attempting uninstall: MarkupSafe
    Found existing installation: MarkupSafe 2.1.2
    Uninstalling MarkupSafe-2.1.2:
      Successfully uninstalled MarkupSafe-2.1.2
  Attempting uninstall: filelock
    Found existing installation: filelock 3.12.0
    Uninstalling filelock-3.12.0:
      Successfully uninstalled filelock-3.12.0
  Attempting uninstall: jinja2
    Found existing installation: Jinja2 3.1.2
    Uninstalling Jinja2-3.1.2:
      Successfully uninstalled Jinja2-3.1.2
  Attempting uninstall: torch
    Found existing installation: torch 2.0.1
    Uninstalling torch-2.0.1:
      Successfully uninstalled torch-2.0.1
Successfully installed MarkupSafe-2.1.2 filelock-3.12.0 jinja2-3.1.2 mpmath-1.3.0 networkx-3.1 sympy-1.12 torch-2.0.1 typing-extensions-4.6.2

Have:

>>> torch.__version__
'2.0.1+cpu'
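The '+cpu' suffix is the tell: running `pip install torch --force-reinstall` without the `--index-url` flag resolves against the default index and reinstalls the CPU-only wheel, overwriting the CUDA build from the earlier step. A sketch of a startup check that turns the cryptic assertion into an actionable hint (the function name and the cu117 index choice are illustrative, not from the repo):

```python
CUDA_INDEX = "https://download.pytorch.org/whl/cu117"  # match this to your driver

def reinstall_hint(torch_version: str):
    """Return a fix command when the installed torch wheel is CPU-only, else None."""
    if torch_version.endswith("+cpu"):
        return ("pip install torch torchvision torchaudio --force-reinstall "
                f"--index-url {CUDA_INDEX}")
    return None

print(reinstall_hint("2.0.1+cpu"))
```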

STILL GET ERROR:

(LocalGPT) E:\ARTIFICIAL-INTELLIGENCE\localGPT>python run_localGPT.py
load INSTRUCTOR_Transformer
max_seq_length  512
Using embedded DuckDB with persistence: data will be stored in: E:\ARTIFICIAL-INTELLIGENCE\localGPT
Loading checkpoint shards: 100%|██████████████████████████████████████████████████████| 2/2 [00:14<00:00,  7.41s/it]
Xformers is not installed correctly. If you want to use memorry_efficient_attention to accelerate training use the following command to install Xformers
pip install xformers.

Enter a query: hey
Traceback (most recent call last):
  File "E:\ARTIFICIAL-INTELLIGENCE\localGPT\run_localGPT.py", line 80, in <module>
    main()
  File "E:\ARTIFICIAL-INTELLIGENCE\localGPT\run_localGPT.py", line 62, in main
    res = qa(query)
          ^^^^^^^^^
  File "E:\Anaconda3\envs\LocalGPT\Lib\site-packages\langchain\chains\base.py", line 140, in __call__
    raise e
  File "E:\Anaconda3\envs\LocalGPT\Lib\site-packages\langchain\chains\base.py", line 134, in __call__
    self._call(inputs, run_manager=run_manager)
  File "E:\Anaconda3\envs\LocalGPT\Lib\site-packages\langchain\chains\retrieval_qa\base.py", line 119, in _call
    docs = self._get_docs(question)
           ^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\Anaconda3\envs\LocalGPT\Lib\site-packages\langchain\chains\retrieval_qa\base.py", line 181, in _get_docs
    return self.retriever.get_relevant_documents(question)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\Anaconda3\envs\LocalGPT\Lib\site-packages\langchain\vectorstores\base.py", line 366, in get_relevant_documents
    docs = self.vectorstore.similarity_search(query, **self.search_kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\Anaconda3\envs\LocalGPT\Lib\site-packages\langchain\vectorstores\chroma.py", line 181, in similarity_search
    docs_and_scores = self.similarity_search_with_score(query, k, filter=filter)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\Anaconda3\envs\LocalGPT\Lib\site-packages\langchain\vectorstores\chroma.py", line 227, in similarity_search_with_score
    query_embedding = self._embedding_function.embed_query(query)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\Anaconda3\envs\LocalGPT\Lib\site-packages\langchain\embeddings\huggingface.py", line 161, in embed_query
    embedding = self.client.encode([instruction_pair])[0]
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\Anaconda3\envs\LocalGPT\Lib\site-packages\InstructorEmbedding\instructor.py", line 521, in encode
    self.to(device)
  File "E:\Anaconda3\envs\LocalGPT\Lib\site-packages\torch\nn\modules\module.py", line 1145, in to
    return self._apply(convert)
           ^^^^^^^^^^^^^^^^^^^^
  File "E:\Anaconda3\envs\LocalGPT\Lib\site-packages\torch\nn\modules\module.py", line 797, in _apply
    module._apply(fn)
  File "E:\Anaconda3\envs\LocalGPT\Lib\site-packages\torch\nn\modules\module.py", line 797, in _apply
    module._apply(fn)
  File "E:\Anaconda3\envs\LocalGPT\Lib\site-packages\torch\nn\modules\module.py", line 797, in _apply
    module._apply(fn)
  File "E:\Anaconda3\envs\LocalGPT\Lib\site-packages\torch\nn\modules\module.py", line 820, in _apply
    param_applied = fn(param)
                    ^^^^^^^^^
  File "E:\Anaconda3\envs\LocalGPT\Lib\site-packages\torch\nn\modules\module.py", line 1143, in convert
    return t.to(device, dtype if t.is_floating_point() or t.is_complex() else None, non_blocking)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\Anaconda3\envs\LocalGPT\Lib\site-packages\torch\cuda\__init__.py", line 239, in _lazy_init
    raise AssertionError("Torch not compiled with CUDA enabled")
AssertionError: Torch not compiled with CUDA enabled

ksylvan commented 1 year ago

Maybe this is your issue? Use python 3.10 instead?

https://stackoverflow.com/questions/75390224/why-pytorch-is-not-installing-on-windows-11-on-python-3-11

seoadsem commented 1 year ago

> Maybe this is your issue? Use python 3.10 instead?
>
> https://stackoverflow.com/questions/75390224/why-pytorch-is-not-installing-on-windows-11-on-python-3-11

Simply not working in any way, not even with the CPU option and Python 3.10.11. Same error.

(LocalGPT) E:\ARTIFICIAL-INTELLIGENCE\localGPT>python ingest.py
Loading documents from E:\ARTIFICIAL-INTELLIGENCE\localGPT/SOURCE_DOCUMENTS
Loaded 1 documents from E:\ARTIFICIAL-INTELLIGENCE\localGPT/SOURCE_DOCUMENTS
Split into 72 chunks of text
load INSTRUCTOR_Transformer
max_seq_length  512
Using embedded DuckDB with persistence: data will be stored in: E:\ARTIFICIAL-INTELLIGENCE\localGPT
Traceback (most recent call last):
  File "E:\ARTIFICIAL-INTELLIGENCE\localGPT\ingest.py", line 49, in <module>
    main()
  File "E:\ARTIFICIAL-INTELLIGENCE\localGPT\ingest.py", line 43, in main
    db = Chroma.from_documents(texts, embeddings, persist_directory=PERSIST_DIRECTORY, client_settings=CHROMA_SETTINGS)
  File "E:\Anaconda3\envs\LocalGPT\lib\site-packages\langchain\vectorstores\chroma.py", line 413, in from_documents
    return cls.from_texts(
  File "E:\Anaconda3\envs\LocalGPT\lib\site-packages\langchain\vectorstores\chroma.py", line 381, in from_texts
    chroma_collection.add_texts(texts=texts, metadatas=metadatas, ids=ids)
  File "E:\Anaconda3\envs\LocalGPT\lib\site-packages\langchain\vectorstores\chroma.py", line 158, in add_texts
    embeddings = self._embedding_function.embed_documents(list(texts))
  File "E:\Anaconda3\envs\LocalGPT\lib\site-packages\langchain\embeddings\huggingface.py", line 148, in embed_documents
    embeddings = self.client.encode(instruction_pairs)
  File "E:\Anaconda3\envs\LocalGPT\lib\site-packages\InstructorEmbedding\instructor.py", line 521, in encode
    self.to(device)
  File "E:\Anaconda3\envs\LocalGPT\lib\site-packages\torch\nn\modules\module.py", line 1145, in to
    return self._apply(convert)
  File "E:\Anaconda3\envs\LocalGPT\lib\site-packages\torch\nn\modules\module.py", line 797, in _apply
    module._apply(fn)
  File "E:\Anaconda3\envs\LocalGPT\lib\site-packages\torch\nn\modules\module.py", line 797, in _apply
    module._apply(fn)
  File "E:\Anaconda3\envs\LocalGPT\lib\site-packages\torch\nn\modules\module.py", line 797, in _apply
    module._apply(fn)
  File "E:\Anaconda3\envs\LocalGPT\lib\site-packages\torch\nn\modules\module.py", line 820, in _apply
    param_applied = fn(param)
  File "E:\Anaconda3\envs\LocalGPT\lib\site-packages\torch\nn\modules\module.py", line 1143, in convert
    return t.to(device, dtype if t.is_floating_point() or t.is_complex() else None, non_blocking)
  File "E:\Anaconda3\envs\LocalGPT\lib\site-packages\torch\cuda\__init__.py", line 239, in _lazy_init
    raise AssertionError("Torch not compiled with CUDA enabled")
AssertionError: Torch not compiled with CUDA enabled

(LocalGPT) E:\ARTIFICIAL-INTELLIGENCE\localGPT>python ingest.py --device_type cpu
Loading documents from E:\ARTIFICIAL-INTELLIGENCE\localGPT/SOURCE_DOCUMENTS
Loaded 1 documents from E:\ARTIFICIAL-INTELLIGENCE\localGPT/SOURCE_DOCUMENTS
Split into 72 chunks of text
load INSTRUCTOR_Transformer
max_seq_length  512
Using embedded DuckDB with persistence: data will be stored in: E:\ARTIFICIAL-INTELLIGENCE\localGPT
Traceback (most recent call last):
  [... same frames as the traceback above ...]
AssertionError: Torch not compiled with CUDA enabled

(LocalGPT) E:\ARTIFICIAL-INTELLIGENCE\localGPT>python run_localGPT.py
load INSTRUCTOR_Transformer
max_seq_length  512
Using embedded DuckDB with persistence: data will be stored in: E:\ARTIFICIAL-INTELLIGENCE\localGPT
Loading checkpoint shards: 100%|█████████████████████████████████████████████████████████| 2/2 [00:14<00:00,  7.27s/it]
Xformers is not installed correctly. If you want to use memorry_efficient_attention to accelerate training use the following command to install Xformers
pip install xformers.

Enter a query: hello
Traceback (most recent call last):
  File "E:\ARTIFICIAL-INTELLIGENCE\localGPT\run_localGPT.py", line 80, in <module>
    main()
  File "E:\ARTIFICIAL-INTELLIGENCE\localGPT\run_localGPT.py", line 62, in main
    res = qa(query)
  File "E:\Anaconda3\envs\LocalGPT\lib\site-packages\langchain\chains\base.py", line 140, in __call__
    raise e
  File "E:\Anaconda3\envs\LocalGPT\lib\site-packages\langchain\chains\base.py", line 134, in __call__
    self._call(inputs, run_manager=run_manager)
  File "E:\Anaconda3\envs\LocalGPT\lib\site-packages\langchain\chains\retrieval_qa\base.py", line 119, in _call
    docs = self._get_docs(question)
  File "E:\Anaconda3\envs\LocalGPT\lib\site-packages\langchain\chains\retrieval_qa\base.py", line 181, in _get_docs
    return self.retriever.get_relevant_documents(question)
  File "E:\Anaconda3\envs\LocalGPT\lib\site-packages\langchain\vectorstores\base.py", line 366, in get_relevant_documents
    docs = self.vectorstore.similarity_search(query, **self.search_kwargs)
  File "E:\Anaconda3\envs\LocalGPT\lib\site-packages\langchain\vectorstores\chroma.py", line 181, in similarity_search
    docs_and_scores = self.similarity_search_with_score(query, k, filter=filter)
  File "E:\Anaconda3\envs\LocalGPT\lib\site-packages\langchain\vectorstores\chroma.py", line 227, in similarity_search_with_score
    query_embedding = self._embedding_function.embed_query(query)
  File "E:\Anaconda3\envs\LocalGPT\lib\site-packages\langchain\embeddings\huggingface.py", line 161, in embed_query
    embedding = self.client.encode([instruction_pair])[0]
  File "E:\Anaconda3\envs\LocalGPT\lib\site-packages\InstructorEmbedding\instructor.py", line 521, in encode
    self.to(device)
  File "E:\Anaconda3\envs\LocalGPT\lib\site-packages\torch\nn\modules\module.py", line 1145, in to
    return self._apply(convert)
  File "E:\Anaconda3\envs\LocalGPT\lib\site-packages\torch\nn\modules\module.py", line 797, in _apply
    module._apply(fn)
  File "E:\Anaconda3\envs\LocalGPT\lib\site-packages\torch\nn\modules\module.py", line 797, in _apply
    module._apply(fn)
  File "E:\Anaconda3\envs\LocalGPT\lib\site-packages\torch\nn\modules\module.py", line 797, in _apply
    module._apply(fn)
  File "E:\Anaconda3\envs\LocalGPT\lib\site-packages\torch\nn\modules\module.py", line 820, in _apply
    param_applied = fn(param)
  File "E:\Anaconda3\envs\LocalGPT\lib\site-packages\torch\nn\modules\module.py", line 1143, in convert
    return t.to(device, dtype if t.is_floating_point() or t.is_complex() else None, non_blocking)
  File "E:\Anaconda3\envs\LocalGPT\lib\site-packages\torch\cuda\__init__.py", line 239, in _lazy_init
    raise AssertionError("Torch not compiled with CUDA enabled")
AssertionError: Torch not compiled with CUDA enabled

(LocalGPT) E:\ARTIFICIAL-INTELLIGENCE\localGPT>python run_localGPT.py --device_type cpu
load INSTRUCTOR_Transformer
max_seq_length  512
Using embedded DuckDB with persistence: data will be stored in: E:\ARTIFICIAL-INTELLIGENCE\localGPT
Loading checkpoint shards: 100%|█████████████████████████████████████████████████████████| 2/2 [00:14<00:00,  7.30s/it]
Xformers is not installed correctly. If you want to use memorry_efficient_attention to accelerate training use the following command to install Xformers
pip install xformers.

Enter a query: hello
Traceback (most recent call last):
  [... same frames as the traceback above ...]
AssertionError: Torch not compiled with CUDA enabled

(LocalGPT) E:\ARTIFICIAL-INTELLIGENCE\localGPT>python -V
Python 3.10.11
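Both tracebacks end inside InstructorEmbedding's encode() calling self.to(device): it is the embedding model, not the language model, that is being moved to CUDA, so a CPU flag that only reaches the LLM would not help (my reading of the traceback, not a confirmed repo bug). A defensive device-selection sketch, with a hypothetical helper name, that falls back instead of asserting:

```python
def pick_device(requested: str, cuda_available: bool) -> str:
    """Fall back to CPU when CUDA was requested but the torch build lacks it,
    instead of letting torch.cuda's lazy init raise an AssertionError."""
    if requested == "cuda" and not cuda_available:
        print("CUDA requested but unavailable; falling back to CPU")
        return "cpu"
    return requested

# In real code the second argument would be torch.cuda.is_available().
print(pick_device("cuda", False))
```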

Neo2003 commented 1 year ago

On Gentoo Linux I had no problem running it on my old 1070 Ti, but it requires more than 32 GB of memory to start.