nomic-ai / gpt4all

GPT4All: Run Local LLMs on Any Device. Open-source and available for commercial use.
https://nomic.ai/gpt4all
MIT License
69.19k stars · 7.59k forks

Can't download GPT4ALL by CLI #1041

Open guguxh opened 1 year ago

guguxh commented 1 year ago

System Info

Windows 10, Python 3.10.9, GPT4ALL-j-v1.3 image

Information

Related Components

Reproduction

The code is literally:

```python
from gpt4all import GPT4All
import copy

llm = GPT4All("ggml-gpt4all-j-v1.3-groovy")
```

Plus some functions after that for prompting and other things.

Expected behavior

Download the model first, then execute the script synchronously.

cosmic-snow commented 1 year ago

If your connection is flaky, you can also try downloading a model from the website: https://gpt4all.io/. Scroll down to 'Model Explorer'.

xcaliber123 commented 1 year ago

If your connection is flaky, you can also try downloading a model from the website: https://gpt4all.io/. Scroll down to 'Model Explorer'.

After downloading models, which folder do I put them in?

cosmic-snow commented 1 year ago

After downloading models, which folder do I put them in?

The default folder for the bindings is .cache\gpt4all in your home directory (%USERPROFILE% in cmd or $env:USERPROFILE in PowerShell).

But you can just pass the folder you want to use as a second parameter, too.
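To make the default location concrete, here is a small sketch that builds the path described above with `pathlib`. The commented `GPT4All` call showing an explicit folder is an assumption about the binding's keyword name (`model_path`), and the Windows path in it is hypothetical:

```python
from pathlib import Path

# Default model directory used by the bindings, per the comment above:
# .cache\gpt4all under the user's home directory (%USERPROFILE% on Windows).
default_model_dir = Path.home() / ".cache" / "gpt4all"
print(default_model_dir)

# Passing an explicit folder instead (keyword name and path are assumptions):
# from gpt4all import GPT4All
# llm = GPT4All("ggml-gpt4all-j-v1.3-groovy", model_path=r"C:\models")
```

Dropping the downloaded `.bin` file into whichever folder you use should let the bindings find it without re-downloading.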

xcaliber123 commented 1 year ago

After downloading models, which folder do I put them in?

The default folder for the bindings is .cache\gpt4all in your home directory (%USERPROFILE% in cmd or $env:USERPROFILE in PowerShell).

But you can just pass the folder you want to use as a second parameter, too.

Thanks. I found that the models were not loading in my Chat app or on the website. After turning on a VPN, it showed the list of all models. Are those links blocked in some countries?

xcaliber123 commented 1 year ago

After downloading models, which folder do I put them in?

The default folder for the bindings is .cache\gpt4all in your home directory (%USERPROFILE% in cmd or $env:USERPROFILE in PowerShell).

But you can just pass the folder you want to use as a second parameter, too.

I have downloaded other LLM models, but they have .safetensors and .pt file extensions. How do I make this Chat app run those models, since it currently only supports .bin models? On Windows 10.

cosmic-snow commented 1 year ago

I have downloaded other LLM models, but they have .safetensors and .pt file extensions. How do I make this Chat app run those models, since it currently only supports .bin models? On Windows 10.

I don't think that's possible. GPT4All only supports models in GGML format, and only the following architectures (although more are planned): GPT-J, LLaMA, MPT, Replit.
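For completeness: a `.safetensors` or `.pt` checkpoint of one of the supported architectures could in principle be converted to GGML with upstream tooling rather than loaded directly. The sketch below is a hypothetical invocation of llama.cpp's converter; the script name, flags, and output naming are assumptions and have changed between llama.cpp versions, so check that project's README for the current workflow:

```shell
# Hypothetical GGML conversion sketch (script name and flags are assumptions):
#
#   git clone https://github.com/ggerganov/llama.cpp
#   python llama.cpp/convert.py /path/to/hf-llama-model --outtype f16
#
# The resulting .bin file would then go into the GPT4All model folder.
echo "GGML conversion sketch"
```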