AIAnytime / Search-Your-PDF-App

Search Your PDF App using Langchain, ChromaDB, Sentence Transformers, and LaMiNi LM Model. This app is completely powered by Open Source Models. No OpenAI key is required.

Problem #4

Open aurangakhtar opened 10 months ago

aurangakhtar commented 10 months ago

Hello, I've encountered two issues.

  1. I had to specify a model manually (MBZUAI/LaMini-T5-738M).

  2. I received the following error message.

(env) C:\AI\Search-Your-PDF-App>app.py
Traceback (most recent call last):
  File "C:\AI\Search-Your-PDF-App\app.py", line 16, in <module>
    base_model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint, device_map='auto', torch_dtype=torch.float32)
  File "C:\AI\env\lib\site-packages\transformers\models\auto\auto_factory.py", line 565, in from_pretrained
    return model_class.from_pretrained(
  File "C:\AI\env\lib\site-packages\transformers\modeling_utils.py", line 3307, in from_pretrained
    ) = cls._load_pretrained_model(
  File "C:\AI\env\lib\site-packages\transformers\modeling_utils.py", line 3428, in _load_pretrained_model
    raise ValueError(
ValueError: The current device_map had weights offloaded to the disk. Please provide an offload_folder for them. Alternatively, make sure you have safetensors installed if the model you are using offers the weights in this format.
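For context, the failing call is the model load on line 16 of app.py. The error itself points at one possible fix: when `device_map='auto'` decides to offload weights to disk, `from_pretrained` needs an `offload_folder`. A minimal sketch of that approach, assuming the same checkpoint; the folder name is illustrative, not part of the repo:

```python
import torch
from transformers import AutoModelForSeq2SeqLM

checkpoint = "MBZUAI/LaMini-T5-738M"  # model specified manually, as noted above

# Keep device_map='auto', but give accelerate an explicit folder for
# any weights it needs to offload to disk.
base_model = AutoModelForSeq2SeqLM.from_pretrained(
    checkpoint,
    device_map="auto",
    torch_dtype=torch.float32,
    offload_folder="offload",  # hypothetical path; any writable directory works
)
```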

What can I do?

Thank you.

Regards.

Kedar-dave commented 8 months ago

You can explicitly set device_map="cpu" (or a GPU device such as "cuda") and retry. This worked for me.
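A minimal sketch of that suggestion, assuming the same checkpoint as above; note that transformers expects device strings like "cpu" or "cuda" for device_map, not "gpu":

```python
import torch
from transformers import AutoModelForSeq2SeqLM

checkpoint = "MBZUAI/LaMini-T5-738M"

# Load the model entirely on one device so no weights are offloaded to disk,
# which avoids the offload_folder error above.
base_model = AutoModelForSeq2SeqLM.from_pretrained(
    checkpoint,
    device_map="cpu",          # or "cuda" if a GPU with enough memory is available
    torch_dtype=torch.float32,
)
```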