Closed: GeeksikhSecurity closed this issue 7 months ago.
This is not a bug. As per the README:
Then, download the LLM model and place it in a directory of your choice:
LLM: default to ggml-gpt4all-j-v1.3-groovy.bin. If you prefer a different GPT4All-J compatible model, just download it and reference it in your .env file.
Download that file and put it in a new folder called models
Create a folder called "models" and download ggml-gpt4all-j-v1.3-groovy.bin into the folder.
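The two steps above can be sketched in Python. The download URL is the one linked later in this thread; the helper name and layout are just for illustration:

```python
import os
import urllib.request

MODEL_URL = "https://gpt4all.io/models/ggml-gpt4all-j-v1.3-groovy.bin"
MODEL_PATH = os.path.join("models", "ggml-gpt4all-j-v1.3-groovy.bin")

def download_model_if_missing(url: str = MODEL_URL, dest: str = MODEL_PATH) -> str:
    """Create the models/ folder and fetch the model file if it is not already there."""
    folder = os.path.dirname(dest)
    if folder:
        os.makedirs(folder, exist_ok=True)
    if not os.path.exists(dest):
        urllib.request.urlretrieve(url, dest)  # the real model is ~3.8 GB
    return dest
```

Run this once from the repo root so that the relative models/ path matches what the script expects.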
Thanks for the insight and your help!
Regards,
Gurvinder Singh
Just FYI, this works as long as langchain is pinned to langchain==0.0.306 or below.
Describe the bug and how to reproduce it
privateGPT.py fails with the model not found. The ingest step worked and created files in the db folder. The model path is not found even though it is defined in the environment variable. Any suggestions would be appreciated:
File "C:\Users\geek\AppData\Roaming\Python\Python310\site-packages\langchain\llms\gpt4all.py", line 169, in validate_environment
    values["client"] = GPT4AllModel(
File "C:\Users\geek\AppData\Roaming\Python\Python310\site-packages\pygpt4all\models\gpt4all_j.py", line 47, in __init__
    super(GPT4All_J, self).__init__(model_path=model_path,
File "C:\Users\geek\AppData\Roaming\Python\Python310\site-packages\pygptj\model.py", line 58, in __init__
    raise Exception(f"File {model_path} not found!")
Exception: File models/ggml-gpt4all-j-v1.3-groovy.bin not found!

Listing of the GPT4All dir:

Mode    LastWriteTime         Length      Name
d-----  5/21/2023  4:35 PM                cache
d-----  5/21/2023  9:21 PM                db
-a----  5/21/2023  4:37 PM    3785248281  ggml-gpt4all-j-v1.3-groovy.bin
-a----  5/21/2023  6:32 PM    7955935232  incomplete-ggml-gpt4all-l13b-snoozy.bin
-a----  5/21/2023  4:35 PM             0  test_write.txt

GPT4All dir db
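A likely cause, given the listing above: the model file sits in the GPT4All dir, but a relative path like models/ggml-gpt4all-j-v1.3-groovy.bin is resolved against the current working directory, so launching the script from any other folder raises exactly this exception. A minimal pre-flight check, assuming the MODEL_PATH variable from the project's example .env (names here are illustrative):

```python
import os

def resolve_model_path(default: str = "models/ggml-gpt4all-j-v1.3-groovy.bin") -> str:
    """Resolve the model path relative to the cwd and fail early with a clearer message."""
    model_path = os.environ.get("MODEL_PATH", default)
    resolved = os.path.abspath(model_path)
    if not os.path.isfile(resolved):
        raise FileNotFoundError(
            f"File {model_path} not found! (resolved to {resolved}; "
            "run the script from the repo root or put an absolute path in .env)"
        )
    return resolved
```

Setting MODEL_PATH in .env to the absolute location of the .bin file sidesteps the working-directory issue entirely.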