polygon / gpt4all-nix

Nix flake for gpt4all

Downloading models doesn't work #5

Open NerveCoordinator opened 10 months ago

NerveCoordinator commented 10 months ago

I attempted to download mistral-7b-openorca.Q4_0.gguf multiple times. The download would appear to complete, but then fail with seemingly random errors:

```shell
Failed to open log file, logging to stdout...
[Debug] (Thu Oct 26 13:53:19 2023): deserializing chats took: 0 ms
[Warning] (Thu Oct 26 13:54:10 2023): Opening temp file for writing: "/home/username/.local/share/nomic.ai/GPT4All/incomplete-mistral-7b-openorca.Q4_0.gguf"
[Warning] (Thu Oct 26 13:55:37 2023): qrc:/gpt4all/qml/ModelDownloaderDialog.qml:389:29 Parameter "link" is not declared. Injection of parameters into signal handlers is deprecated. Use JavaScript functions with formal parameters instead.
[Warning] (Thu Oct 26 13:56:28 2023): Opening temp file for writing: "/home/username/.local/share/nomic.ai/GPT4All/incomplete-mistral-7b-openorca.Q4_0.gguf"
[Warning] (Thu Oct 26 13:58:05 2023): Opening temp file for writing: "/home/username/.local/share/nomic.ai/GPT4All/incomplete-mistral-7b-openorca.Q4_0.gguf"
[Warning] (Thu Oct 26 13:58:05 2023): "ERROR: Downloading failed with code 299 \"Error transferring https://gpt4all.io/models/gguf/mistral-7b-openorca.Q4_0.gguf - server replied: \""
[Warning] (Thu Oct 26 13:58:21 2023): Opening temp file for writing: "/home/username/.local/share/nomic.ai/GPT4All/incomplete-mistral-7b-openorca.Q4_0.gguf"
```

However, downloading the model from the_bloke and putting it in /home/username/.local/share/nomic.ai/GPT4All/ worked just fine as a workaround.
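For anyone else hitting this, the manual workaround can be sketched as a short script. The Hugging Face URL is an assumption (TheBloke's Mistral-7B-OpenOrca-GGUF repo); the destination directory matches the path shown in the logs above, and `GPT4ALL_MODEL_DIR` is just a hypothetical override variable for this sketch.

```shell
# Sketch of the manual workaround: fetch the GGUF from TheBloke's
# Hugging Face mirror (URL is an assumption) into the directory
# GPT4All scans for models on Linux.
model="mistral-7b-openorca.Q4_0.gguf"
dest="${GPT4ALL_MODEL_DIR:-$HOME/.local/share/nomic.ai/GPT4All}"
url="https://huggingface.co/TheBloke/Mistral-7B-OpenOrca-GGUF/resolve/main/$model"

mkdir -p "$dest"
if [ ! -f "$dest/$model" ]; then
  # -L follows redirects; -C - resumes a partially completed download
  curl -L -C - --fail -o "$dest/$model" "$url" || echo "download failed" >&2
fi
```

Once the file is in place, restarting GPT4All should make the model show up in the model list without going through the built-in downloader.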

polygon commented 10 months ago

You may want to raise this issue in the actual upstream repository; this repo only contains the packaging for NixOS: https://github.com/nomic-ai/gpt4all

NerveCoordinator commented 10 months ago

So, I'm currently only on NixOS, which tends to mean I hit weird problems that don't occur on, say, Ubuntu, so I felt less sure about posting it there. Since it was such an obvious, immediate inability to use the program, I deemed it unlikely to be a general problem.

So I figured: 1) if it is not a NixOS-specific issue, someone else will notice it elsewhere, point it out, and fix it there; 2) if it is a NixOS-specific issue, the next NixOS user who encounters it can find my workaround here, add context, or otherwise.

I am admittedly unsure of the etiquette or ideal decision path for this sort of thing. I suppose I should have tried to reproduce it on a non-NixOS build to figure out whether the bug exists in the main repo too?

polygon commented 10 months ago

Does the issue persist? There have been some updates to the URL list upstream that should now be included in the build used by this flake.

You can also try updating the flake inputs with `nix flake update` to build the latest version from upstream. In the past, this would often break the build because the upstream author added new dependencies that required changes to the derivation, but recently things seem to have mostly stabilized.
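Schematically, the update path described above would look like this from a checkout of this flake (building the default package here is an assumption; `nix flake show` lists the outputs the flake actually provides):

```shell
# Run from the flake's repo root: refresh flake.lock so inputs point
# at the latest upstream commits, then rebuild the default package.
nix flake update
nix build
```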

aleeusgr commented 9 months ago

Yes, it does, although it requires some dancing:

Using the internal downloader gives error 401:

```shell
[Warning] (Sun Dec 10 19:20:56 2023): Opening temp file for writing: "/home/alex/.local/share/nomic.ai/GPT4All/incomplete-mistral-7b-openorca.Q4_0.gguf"
[Warning] (Sun Dec 10 19:25:42 2023): stream 21 finished with error: "Internal server error"
[Warning] (Sun Dec 10 19:25:42 2023): Opening temp file for writing: "/home/alex/.local/share/nomic.ai/GPT4All/incomplete-mistral-7b-openorca.Q4_0.gguf"
[Warning] (Sun Dec 10 19:25:42 2023): "ERROR: Downloading failed with code 401 \"Internal server error\""
[Warning] (Sun Dec 10 19:40:58 2023): stream 23 finished with error: "Internal server error"
[Warning] (Sun Dec 10 19:40:58 2023): "ERROR: Network error occurred attempting to download mistral-7b-openorca.Q4_0.gguf code: 401 errorString Internal server error"
[Warning] (Sun Dec 10 19:40:58 2023): QNetworkReplyImplPrivate::error: Internal problem, this method must only be called once.
```

There is also a similar issue upstream: https://github.com/nomic-ai/gpt4all/issues/1381