gbtb / nix-stable-diffusion

Flake for running SD on NixOS
113 stars 21 forks

vNext #17

Closed gbtb closed 1 year ago

gbtb commented 1 year ago

To-Do list

max-privatevoid commented 1 year ago

Adding some TODOs I'm planning to do myself or with @MatthewCroughan here

gbtb commented 1 year ago

Hello, max :wave: Thanks for sharing your plans here.

Fix AMD GPU support without using torch-bin

I guess that practically means using upstream ROCm packages? I think using source packages is a good idea, as long as it doesn't mean that each user has to recompile torch from source, because that takes a lot of time.
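For context, recent nixpkgs revisions carry a source-built ROCm variant of torch, so an overlay along these lines could swap it in. A sketch, assuming `torchWithRocm` exists in the pinned nixpkgs; whether it avoids a local recompile depends on binary-cache coverage:

```nix
# Sketch: prefer nixpkgs' source-built ROCm torch over the prebuilt torch-bin.
# Assumes python3Packages.torchWithRocm is available in the pinned nixpkgs.
final: prev: {
  python3 = prev.python3.override {
    packageOverrides = pySelf: pySuper: {
      # every dependent package in this python3 set now sees the ROCm build
      torch = pySuper.torchWithRocm;
    };
  };
}
```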

Get rid of submodule usage

I started using submodules because, at the time, these UIs (and even some transitive dependencies) behaved impurely: they expected writable repo directories for model caching, and no simple path-override mechanisms were available. The situation has probably improved since then, especially for InvokeAI.
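Where upstream honors cache environment variables, one way around the writable-directory assumption is a wrapper that redirects the caches. A sketch, assuming an `invokeai` derivation is in scope; `HF_HOME` and `TORCH_HOME` are the standard cache variables for huggingface_hub and torch.hub:

```nix
pkgs.runCommand "invokeai-wrapped" { nativeBuildInputs = [ pkgs.makeWrapper ]; } ''
  # Point model caches at the user's home instead of the repo checkout,
  # unless the user has already set them.
  makeWrapper ${invokeai}/bin/invokeai $out/bin/invokeai \
    --run 'export HF_HOME="''${HF_HOME:-$HOME/.cache/huggingface}"' \
    --run 'export TORCH_HOME="''${TORCH_HOME:-$HOME/.cache/torch}"'
''
```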

Reduce/remove overlays

Most of the overlays I used were there because of missing packages; upstreaming some of the popular ones to nixpkgs might be a good way to shrink the overlays :wink:
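For reference, the overlays in question are mostly of this shape (pname, version, and hash below are placeholders, not a real package):

```nix
# Sketch of the kind of overlay this repo carries: packaging a Python
# dependency that is missing from nixpkgs.
final: prev: {
  python3 = prev.python3.override {
    packageOverrides = pySelf: pySuper: {
      some-missing-dep = pySuper.buildPythonPackage rec {
        pname = "some-missing-dep";  # hypothetical package
        version = "0.1.0";
        src = pySuper.fetchPypi {
          inherit pname version;
          hash = "sha256-AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA=";
        };
      };
    };
  };
}
```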

MatthewCroughan commented 1 year ago

Hacked on this with @max-privatevoid at FOSDEM. One plan I have is to make a flake for AI projects, as we also managed to package https://github.com/KoboldAI/KoboldAI-Client. We also want to put the Hugging Face models into /nix/store by using a git clone, rather than allowing the program to perform the fetching at runtime.
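Since Hugging Face model repos are plain git repos (with weights behind git-lfs), the git-clone idea could look roughly like this; rev and hash are placeholders to be pinned:

```nix
# Sketch: pin a Hugging Face model repo into /nix/store via a git clone.
# fetchgit's fetchLFS pulls the actual weight files instead of LFS pointers.
pkgs.fetchgit {
  url = "https://huggingface.co/runwayml/stable-diffusion-v1-5";
  rev = "...";      # pin a specific commit
  fetchLFS = true;
  hash = "sha256-...";
}
```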

crertel commented 1 year ago

Y'all are awesome! Great work!

MatthewCroughan commented 1 year ago

I set up https://github.com/nixified-ai with @max-privatevoid to see if we can make this an open source project that people will contribute to. Right now the main two outputs are invokeai-amd and invokeai-nvidia.

gbtb commented 1 year ago

> I set up https://github.com/nixified-ai with @max-privatevoid to see if we can make this an open source project that people will contribute to. Right now the main two outputs are invokeai-amd and invokeai-nvidia.

I think there's actually a pretty good chance folks will come and participate. During the ~4 months I've had this repo open, a couple of people came by and contributed most of the work needed to run the Automatic webui; I just had to help them finish some of the hard parts. With a good foundation, guidelines, and supporting tooling (e.g. pynixify), it's even more likely.

arcnmx commented 1 year ago

> - Get rid of submodule usage
> - Reduce/remove overlays

FWIW, I've just thrown up some of my personal setup of this from last year here (diff). I'll have to take a look at how it compares to the nixified-ai flake!

> I've started to use submodules because at the time these UIs (and even some transitive dependencies) had impure behavior and expected writable repo directories for model caching and no simple path override mechanisms were present.

Patches help here, but these are services that require at least a state/config directory. The best you can generally do is provide a skeleton dir and (optionally) prepopulate it as part of a wrapper. Alternatively, provide a NixOS module that does this for you in /var/lib/webui via tmpfiles and sets up a systemd service, etc.
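The module approach could be sketched roughly like this; the package name and `--web` flag are illustrative, not the actual InvokeAI interface:

```nix
# Sketch of the NixOS-module approach: state in /var/lib/webui, created
# via tmpfiles, with the UI running as a systemd service.
{ config, pkgs, ... }:
{
  systemd.tmpfiles.rules = [
    "d /var/lib/webui 0750 webui webui -"
    "d /var/lib/webui/models 0750 webui webui -"
  ];

  systemd.services.webui = {
    wantedBy = [ "multi-user.target" ];
    serviceConfig = {
      User = "webui";
      WorkingDirectory = "/var/lib/webui";
      ExecStart = "${pkgs.invokeai}/bin/invokeai --web";  # hypothetical package/flag
    };
  };

  users.users.webui = {
    isSystemUser = true;
    group = "webui";
  };
  users.groups.webui = { };
}
```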

MatthewCroughan commented 1 year ago

@arcnmx It looks to me like anything that uses PyTorch has this models/ folder in common. I was thinking we could populate it with symlinks into the Nix store via systemd tmpfiles in a nixosModule; that's what I'm going to attempt in the nixified.ai repo when the time comes.
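The symlink idea could be a tmpfiles `L+` rule pointing a models/ entry at a store path. A sketch, where `sd15` stands in for a model fetched into the store (URL is a real HF weights file, but the hash is a placeholder and the target layout is illustrative):

```nix
{ pkgs, ... }:
let
  sd15 = pkgs.fetchurl {
    url = "https://huggingface.co/runwayml/stable-diffusion-v1-5/resolve/main/v1-5-pruned-emaonly.ckpt";
    hash = "sha256-...";  # placeholder, to be pinned
  };
in {
  # "L+" (re)creates the symlink on boot, pointing into /nix/store
  systemd.tmpfiles.rules = [
    "L+ /var/lib/webui/models/v1-5-pruned-emaonly.ckpt - - - - ${sd15}"
  ];
}
```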

gbtb commented 1 year ago

Released version 2.0.0.